ATI All-In-Wonder X1900 PCIe Review
An anonymous reader writes "ViperLair is currently running a closer look at ATI's newly released All-In-Wonder X1900 PCIe graphics card. The clock speeds and memory are pretty comparable to other cards available, but the reviewer warns that 'clock speeds do not always tell the whole story.' The review tests performance in Doom 3, UT 2004, Far Cry, Half-Life 2, and the 3DMark06 benchmarking tool." This release comes relatively quickly after the X1800 series, which was released just last October.
I'm giving up on my All in Wonder (Score:4, Informative)
Re:I'm giving up on my All in Wonder (Score:1)
Re:I'm giving up on my All in Wonder (Score:3, Insightful)
Re:geforce (Score:2)
They have a brand, say it works fine, and call themselves 'fans'; I find it interesting.
When I buy a card, I look at both ATI and NVIDIA, find the price range I want, and after going through reviews, buy the one that has the best FPS/$.
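As a rough sketch of that FPS-per-dollar comparison (the card names, prices, and frame rates below are invented placeholders, not figures from the review):

```python
# Hypothetical sketch of the "best FPS per dollar" shopping heuristic
# described above; all numbers here are made-up examples.
cards = [
    {"name": "Card A", "price": 200.0, "avg_fps": 60.0},
    {"name": "Card B", "price": 300.0, "avg_fps": 75.0},
    {"name": "Card C", "price": 150.0, "avg_fps": 40.0},
]

def fps_per_dollar(card):
    """Frames per second delivered for each dollar spent."""
    return card["avg_fps"] / card["price"]

# Rank best value first and print the result.
for card in sorted(cards, key=fps_per_dollar, reverse=True):
    print(f'{card["name"]}: {fps_per_dollar(card):.3f} FPS/$')
```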
Re:geforce (Score:4, Funny)
Re:geforce (Score:3)
Linux support and a company's reputation. ATI has made some lousy drivers and products in the past. However, I find NVIDIA to be falling behind, and both their Linux and Windows drivers are unstable. I had to downgrade to an older HCL-certified Windows driver for my GeForce 6600. Strange things kept happening. SuSE has a big warning on drivers too.
ATI now makes cards that are faster and have better effects and visuals than NVIDIA's, while their drivers are improving. Even o
Re:geforce (Score:3, Insightful)
I only run Linux. I used to be a huge fan of ATI. I still have a Radeon 9800 Pro, and an Xpress 200 integrated chipset.
Both are a HUGE pain in Linux, compared to Nvidia. Huge. Gigantic.
Their drivers have improved, yes. But they still suck quite a bit. At least they usually compile/install correctly now, but performance is crappy.
Re:geforce (Score:2)
Have you tried any Cedega gaming? Do Cedega (Windows) games work properly?
Re:geforce (Score:2)
think about that (Score:4, Insightful)
If $200 gets you 40 fps, that's 50 cents a frame.
If a $10 card gets you 30 fps, it wins on that measure.
Are you really going to buy ONLY based on $/fps?
Enjoy your $10 card.
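For reference, working the cost per frame out from the two figures quoted above:

\[
\frac{\$200}{40\ \text{fps}} = \$5.00\ \text{per frame},
\qquad
\frac{\$10}{30\ \text{fps}} \approx \$0.33\ \text{per frame}
\]

so the "50 cents a frame" figure is off by a factor of ten, as a reply below notes.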
Re:think about that (Score:2)
off by a factor of ten (Score:1)
$500 (Score:1, Flamebait)
Re:$500 (Score:4, Insightful)
Re:$500 (Score:3, Informative)
Re:$500 (Score:3, Funny)
Well... if you have a dual-screen setup.
Re:$500 (Score:2)
Re:$500 (Score:2)
I just picked a fairly new game to illustrate my point. I don't think it matters if it's a single or multiplayer game...
Re:$500 (Score:1, Offtopic)
But if you want to know who the crazies are, it is
Re:$500 (Score:2)
That's pretty crazy! Doom 3 was probably the last hardcore 3D game I've played, but wouldn't the Xbox 360 provide a better gaming experience for a lot less money?
Re:$500 (Score:1)
Re:$500 (Score:2)
Primarily useless benchmarks... (Score:5, Insightful)
Re:Primarily useless benchmarks... (Score:1)
Here is a list I compiled by checking out many different benchmarks. In general, the faster cards are on top and the slower ones below. Since I am concentrating on affordable cards, I haven't placed many expensive cards above the NVIDIA 6600 GT and Radeon X1600 XT, so there are many high-end cards available now that are not on this list. If you see a few cards back-to-back with an equals sign (=) in front, that means they are very similar in performance to the adjacent cards that also have the "=" sign.
N/A = dis
Re:Primarily useless benchmarks... (Score:1)
Half-Life 2 has been out for a year; there are tougher tests for a video card, like the Lost Coast expansion pack.
Another ATI (Score:1, Flamebait)
In related news (Score:2)
Re:In related news (Score:1)
perhaps time for the older 1800 (Score:2)
It's like nothing is fast enough. After reading about the trillion or so polygons for Unreal 3, or whatever it's going to be called, I need a new card. The graphics are stunning [unrealtechnology.com], and I wonder if even the X1900 will be able to handle it.
Re:perhaps time for the older 1800 (Score:1)
I also have a GF6600, with an Athlon 64 3200 (2.0GHz) CPU. I've heard that the major bottleneck for EQ2 is the somewhat low VRAM on the card (128 MB), but I also notice that the CPU is running at full capacity as well.
Any ideas? Upgrading either would cost roughly the same, and I want to make sure I pick the right one.
I'm still waiting ... (Score:2, Interesting)
- Adam
I'm waiting on my PVR system (Score:2)
Re:Sorry to hijack this thread.. 2 monitors? (Score:1)
Re:Sorry to hijack this thread.. 2 monitors? (Score:1)
Re:Sorry to hijack this thread.. 2 monitors? (Score:1)
Re:Sorry to hijack this thread.. 2 monitors? (Score:1)
48 pixel pipelines (Score:2)
Very nice card; the price is steep, but nice.
Re:48 pixel pipelines (Score:5, Informative)
Re:48 pixel pipelines (Score:3)
All-In-Wonder Comparison

                            X1900     X1800 XL   2006      X800 XL   X800 XT
PCI Express                 Yes       Yes        Yes       Yes       No
Core Clock (MHz)            500       500        450       400       500
Memory Clock (MHz)          480       500        400       490       500
Vertex Pipelines            8         8          2         6         6
Pixel Pipelines             48        16         4         16        16
Microtune Tuner             IC 2121   IC 2121    IC 2121   IC 2121   MT2050
Shader Model 3.0            Yes       Yes        Yes       No        No
Avivo, H.264 Acceleration   Yes       Yes        No        No        No
Re:48 pixel pipelines (Score:1)
Nice card, but sucks watts like there's no tomorrow (Score:1)
NVidia's cards used to be the ones that sucked the most watts and still weren't the best performers. Now it's ATI! Ugh... Fortunately NVidia's got the best Linux drivers.