
ATI Launches Crossfire... Finally 171

Steve from Hexus writes "After a long wait, ATI's multi-GPU solution - CrossFire - is finally here. Hexus checks out Crossfire using an X850 Crossfire setup, which can be beaten in performance by a single GeForce 7800 GTX in some games. Too little too late, or will R520-based Crossfire prove more fruitful? Hexus also examines how Crossfire works, how easy it is to set up, and what its limitations are with current hardware." Looks very interesting - I'd love to get one for review.
  • by HiroProtagonist ( 56728 ) on Monday September 26, 2005 @11:02AM (#13650726) Homepage
    You guys trying to kill Hexus today or what?
  • wahoo (Score:1, Funny)

    by Anonymous Coward
    Crossfire.... remember that board game when we were kids?

    http://www.letsgoretro.com/classic_games_crossfire .html [letsgoretro.com]
    • Re:wahoo (Score:1, Offtopic)

      by TheViffer ( 128272 )
Sure do. Saddens me that they don't make those lawn darts anymore.
      • Re:wahoo (Score:1, Funny)

        by Anonymous Coward
        Now if you were to combine crossfire with lawn darts, then you'd have an interesting game.
  • Oh Boy... (Score:5, Funny)

    by jwilhelm ( 238084 ) on Monday September 26, 2005 @11:02AM (#13650729) Homepage Journal
    Happy "Abuse Hexus Day" everyone!
  • Argh! (Score:5, Insightful)

    by bassgoonist ( 876907 ) <{moc.liamg} {ta} {ecurb.m.noraa}> on Monday September 26, 2005 @11:03AM (#13650736) Journal
Max res of 1600x1200 at 60Hz...how...disappointing.
    • Yes, with the bigger displays getting cheaper by the day, a higher resolution than 1600x1200 at 60Hz would really be preferable. At least run it at 100Hz. 60Hz really strains your eyes.
      • Re:Resolution issues (Score:3, Informative)

        by ivan256 ( 17499 ) *
        60Hz is essentially the maximum you can achieve over DVI-D at 1600x1200.
        • Re:Resolution issues (Score:4, Informative)

          by prefect42 ( 141309 ) on Monday September 26, 2005 @11:37AM (#13651009)
Take a peek at high-end nVidia cards and it's a different story. With dual-link DVI and both ports combined, you can push it to 3840x2400. I believe that's at 60Hz, but don't quote me. We've got an FX3000 running at that resolution at 15Hz.
          • That sounds like the same story to me. Over a single DVI-D link, you can only push enough pixels to get 60Hz at 1600x1200. It has nothing to do with the card, it's the interconnect. If you want faster, you have to go to dual-link.
            • Re:Resolution issues (Score:3, Interesting)

              by prefect42 ( 141309 )
              Either dual-link or more literally as I mentioned we were doing: two connectors. They could just use both connectors from the non-Crossfire card to connect to the Crossfire card, giving you 1920x1200@100Hz.
        • I don't know why monitors don't just double scan things to reduce the flicker to make it bearable to view. I'd be happy with a 50Hz screen refresh if the monitor was displaying it at 100Hz, heck, when I'm not gaming a 25Hz refresh would probably be good enough if quadrupled in the monitor.

I think cinemas do something similar, where each frame is displayed twice so that the flicker is less noticeable while the frame rate remains the same. Maybe it's more expensive/difficult for a CRT.
It's a good idea, but it would require something like an internal framebuffer inside the monitor, delaying the video by at least half a frame. CRTs are mostly analog and very simple, so it would need a substantial amount of new hardware to implement.

            On the other hand, this means you could put all of this hardware in a separate box outside the monitor and use it with any video source and any monitor (assuming you put enough memory and fast enough RAMDACs in the box). Sounds like an interesting project.
As always: price. They simply do whatever's cheaper. Why do you think only expensive monitors manage high refresh rates at high resolutions?
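The external scan-doubler box proposed in this sub-thread can be sized with back-of-the-envelope arithmetic. A minimal sketch in Python, assuming illustrative figures (1600x1200 at 24-bit color, doubling a 50Hz source to 100Hz); the function name and numbers are hypothetical, not the spec of any real device:

```python
# Rough sizing for a hypothetical external scan-doubler box:
# buffer the incoming frame, then scan it out twice as fast.

def scan_doubler_budget(width, height, bits_per_pixel, in_hz, out_hz):
    """Return (framebuffer_bytes, output_bandwidth_bytes_per_sec)."""
    frame_bytes = width * height * bits_per_pixel // 8
    # Double-buffer: write the incoming frame while scanning out the last one.
    framebuffer_bytes = 2 * frame_bytes
    # The RAMDAC side must emit a full frame per output refresh.
    out_bandwidth = frame_bytes * out_hz
    return framebuffer_bytes, out_bandwidth

fb, bw = scan_doubler_budget(1600, 1200, 24, 50, 100)
print(fb)  # 11520000 bytes (~11 MiB of buffer memory)
print(bw)  # 576000000 bytes/s the output side must sustain
```

Roughly 11 MiB of fast memory and ~576 MB/s of sustained output bandwidth, which goes some way toward explaining the "as always: price" answer above.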
      • I have a question though - is this true with LCD monitors as well? Every LCD monitor I have defaults to 60Hz so I leave it. I obviously always adjust my CRT displays, but I can't tell a difference with LCD.
LCDs look the same at 60Hz or 120Hz because LCD pixels have such a huge persistence of state. It takes a while for an LCD pixel to revert to neutral, but it takes almost no time for a CRT pixel; the phosphor stops glowing relatively quickly.
          • Re:Resolution issues (Score:4, Informative)

            by default luser ( 529332 ) on Monday September 26, 2005 @04:47PM (#13653510) Journal
            No. LCD pixels aren't just "more persistent" than CRT pixels.

            LCD pixels hold their current state until the input signal changes. There is no scan period on an LCD, the 60Hz signal is simply a convenient way to bridge the gap between raster scan displays and active-matrix displays.

            So, if you send an LCD a set of successive white screens, after the initial white screen no pixels will change, ever. A CRT, on the other hand, will write a white pixel to every part of the screen once every 1/60th of a second...and while the beam is not concentrated on a particular pixel, its brightness will fade.

It has nothing to do with LCDs' slower response time, as you implied.
            • I did not say that LCDs have a slower response. I said that it takes them a long time to go neutral after being told what color to be.

Pick up your Game Boy, turn it on, and when a screen shows up, turn it off. The image does not stay there forever; it slowly fades away.

              That effect is what I was describing.
        • There's a gigantic difference between how each display actually 'displays' an image.

In a CRT, Cathode Ray Tube, you have a phosphorescent screen that gets lit up by an electron gun. The gun is in the back of the monitor and fires forward, one line at a time (which is why larger monitors are generally much deeper: the gun has to be further back to paint the whole screen). Refresh rate is basically how fast this gun can draw lines across your screen in sequential order. Better refresh rates (75h
    • Re:Argh! (Score:5, Insightful)

      by Brain_Recall ( 868040 ) <brain_recall@ya[ ].com ['hoo' in gap]> on Monday September 26, 2005 @11:43AM (#13651043)
The Inquirer [theinquirer.net] discussed this limitation before. The Crossfire system can do 1920x1200, but only at 52Hz. The SiI 1161 chip on the Crossfire card that merges the two data streams has this bandwidth limitation, and it appears ATi won't be fixing it for a while.

Personally, I feel the Crossfire solution has far too many drawbacks for the benefits. Not only do you require a special motherboard, but now you also need a special Crossfire-capable video card. The second card can be any card, but the RAM buffer should be the same size, otherwise it will default to the lowest value for both cards. The external cable adds some nice external heft to the system, as well.

So, what do you get over the SLI system? There are added anisotropic filtering methods and increased anti-aliasing, but these are already appearing in the latest nVidia drivers. You can use your existing card to upgrade to a Crossfire system, but you can already do this with SLI. All in all, the system has its flaws, too many I think to make it worthwhile.

Max res of 1600x1200 at 60Hz...how...disappointing.

      Ha. I run everything at 800x600@75Hz on my 15" monitor. It actually works very well for me in most cases and makes everything look warmer IMO. I find it funny how you're whining about the lack of excess.

  • by digitalderbs ( 718388 ) on Monday September 26, 2005 @11:04AM (#13650751)
    Great! Now we'll only have to wait about two years for mediocre linux support.
  • by LegendOfLink ( 574790 ) on Monday September 26, 2005 @11:06AM (#13650760) Homepage
    ...for Jon Stewart. Folks say that man has been known to cease CrossFires that are just full of hot air. ATI had better deliver.
  • Coral Cache (Score:4, Informative)

    by Anonymous Coward on Monday September 26, 2005 @11:06AM (#13650766)
    Coral Cache [nyud.net]
  • Dammit (Score:5, Insightful)

    by MatD ( 895409 ) on Monday September 26, 2005 @11:07AM (#13650769)
Why, why, why can't the editors change the links to use Coral Cache? It's ridiculous that every story on Slashdot concerns an article that no one can read. Is it really any wonder that people post without RTFA?
    • Maybe because a huge percentage of the people who visit the site do so during work hours. And a huge percentage of those people are working for companies that block everything but port 80.
    • Re:Dammit (Score:5, Insightful)

      by Bronz ( 429622 ) on Monday September 26, 2005 @12:30PM (#13651447)
      It might be that part of the /. revenue stream is selling subscriptions to people who want to read stories before the servers melt. I'm not saying that's bad, I'm just saying it might partly answer your question.

Because a large part of their viewership is only able to view websites that come across over port 80. Many, many corporate firewalls block ports like 8090, my firewall included.
  • ExtremeTech's Review (Score:5, Informative)

    by ThinSkin ( 851769 ) on Monday September 26, 2005 @11:08AM (#13650778)
Under the hood, performance, compatibility issues, SLI differences...

    http://www.extremetech.com/article2/0,1697,1862962 ,00.asp [extremetech.com]

  • MGP (Score:5, Interesting)

    by Doc Ruby ( 173196 ) on Monday September 26, 2005 @11:09AM (#13650783) Homepage Journal
    Multiprocessing general-purpose apps on a GPU [gpgpu.org]?
It is possible to run some processing tasks on the GPU - something like a matrix inversion should be fairly quick - but general-purpose apps aren't really viable due to the time/bandwidth it takes to move data between the GPU and CPU.
      • Er, did you click the link? That's an extensive website representing some of the General Purpose GPU work delivering apps right now. According to your feasibility criteria, general purpose Internet apps aren't possible due to client/server latency. Yet you're reading this message.
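The disagreement above is really about arithmetic intensity: offloading to the GPU only pays off when the compute time saved exceeds the cost of shuttling data over the bus. A rough sketch of that tradeoff; the bus figure and timings below are illustrative assumptions, not measurements of any particular card:

```python
# Is a GPU offload worth it? Only when compute saved beats transfer cost.

def worth_offloading(data_bytes, cpu_time_s, gpu_time_s,
                     bus_bytes_per_s=1e9):
    """True if GPU compute plus round-trip transfer beats the CPU alone."""
    transfer_s = 2 * data_bytes / bus_bytes_per_s  # upload + readback
    return gpu_time_s + transfer_s < cpu_time_s

# Big matrix op: lots of compute per byte moved, so the offload wins.
print(worth_offloading(64e6, cpu_time_s=2.0, gpu_time_s=0.1))   # True
# Small task: the transfer dominates, so it stays on the CPU.
print(worth_offloading(64e6, cpu_time_s=0.05, gpu_time_s=0.01)) # False
```

Both posters are right about different points on this curve: dense linear algebra sits on the winning side, while chatty general-purpose work tends not to.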
  • I just checked ati.com :)
  • Coral (Score:1, Redundant)

    by GweeDo ( 127172 )
    Those poor souls [nyud.net]
  • by Anonymous Coward
  • by PIPBoy3000 ( 619296 ) on Monday September 26, 2005 @11:13AM (#13650812)
    NVidia currently has a couple SLI cards [hardocp.com], which perform quite well. I recently picked up a 7800 GT, the low-end of the high-end cards, for around $350. The plan is to pick up a second one when the price drops to around $100. It's very reminiscent of my Voodoo 2 experience - the first cost $300 and the second cost $30.

    Of course, Crossfire has the benefit of working with any other ATI card past a certain point. With NVidia's offerings, you have to match the card exactly (though supposedly the manufacturer doesn't matter). For my needs, it doesn't matter all that much, but it's something to consider.

    Not that I'm a fanboy of either vendor. My last card was a Radeon 9800 Pro, which has worked great these last couple years. Now it seems that NVidia has the card that works best for my needs. Ain't competition grand?
    • by UnrefinedLayman ( 185512 ) on Monday September 26, 2005 @12:11PM (#13651302)
      The plan is to pick up a second one when the price drops to around $100.
      By the time you pick up a second one for $100 you're going to be a long, long way behind the curve. The GeForce 6800 GT (the previous generation equivalent of your card), released in June of 2004, still retails for $260-$280. Even the vanilla 6800s are ~$175. I would be surprised if you could buy a 9800 Pro for $100.

      SLI is a neat idea if the performance increase is tangible, but considering a single 7800 GTX can outperform an SLI setup of 6800 Ultras in many cases after only one year since the previous generation's release... by the time a 7800 GT is $100, it won't be worth $100 (much like all $100 video cards).
      • Like most things, it'll depend on issues such as cost, performance, and the timing of where I am in the upgrade process. Adding a second Voodoo 2 for $30 was great, as that's basically pocket change. Of course, by that time the computer was my secondary computer anyway. The boost in performance was nice, but not critical.

        I typically have two gaming computers at any one time. It doesn't have to be top-notch, but it's handy if it has reasonably good performance. When it stutters or is slow, my son mak
    • The plan is to pick up a second one when the price drops to around $100.

This is sort of the flaw in a lot of people's SLI plans. (Because Crossfire can use one non-Crossfire-compatible card, ATI's solution might not be quite as bad.)
      A 6800 GT is still about $275-300 right now; the 6800 Ultra is $350 and up.
      You can hardly buy a 5900/U/5950 anymore.

      If you don't mind going to ebay or second hand cards for your second one you can probably save a decent amount of money on the second card, of course you already spe
Just curiosity: what do you need these big-ass cards for? I can't think of anything except professional gaming.

  • Drivers... (Score:4, Interesting)

    by juiceCake ( 772608 ) on Monday September 26, 2005 @11:15AM (#13650823)
    I recently took the plunge and went back to ATI after hearing their drivers had much improved. After far too many VPU errors I ditched them again and went back to nVidia. Is it just me that has these problems? I wonder if the same driver issues will come up in the SLI cards.
I don't know about anyone else, but my AIW card had VPU errors left and right, so I went to an ATI 9700 and it ran rather nicely for a while, and then for no reason one day all 3D functionality became extremely crippled. I tried everything, re-seating the card; hell, I even dumped the hard drive and reinstalled Windows. I gave up and found a GeForce 6800 128MB card on Newegg for 168.00, and RivaTuner can unlock the extra pipes and vertex shaders to make it run like the GT version, which sells for 230.00 on ne
  • by Namronorman ( 901664 ) on Monday September 26, 2005 @11:15AM (#13650825)
It sounds like, even though Crossfire might not be the glorious thing everyone has been waiting for, in the future it might prove better than SLI. I, for one, though, feel that it would be better to just wait for that one graphics card than to get two at the moment, considering how fast they become obsolete.

Anyways, from what I've read and been told, SLI requires special profiles in games to be taken advantage of, while Crossfire presents itself as one graphics card and doesn't require anything but the default drivers.
  • by Anonymous Coward on Monday September 26, 2005 @11:16AM (#13650839)
    Page 2 :

    "By definition, a single-link DVI connection only has enough bandwidth at its maximum clock rate to carry a 1600x1200 image at up to 60Hz, or a 1900x1200 image displayed at 54Hz. Therefore in terms of what the slave can send the master board for output via the compositing chip, it's limited to those resolutions."

    Limited only if you read the original DVI spec. How does he think people run the HP and Apple 23" displays and the Dell 24" display over a single-link connection?

    All card manufacturers, and 1920x1200 display manufacturers, allow you to run the channel with a reduced blanking interval, and so squeeze in the extra bandwidth needed for 1920x1200x60.

Bad start to the review - I'm not going to continue reading (even if I could after it has been slashdotted).
    • Limited only if you read the original DVI spec. How does he think people run the HP and Apple 23" displays and the Dell 24" display over a single-link connection?

The trouble is that the max resolution of the Crossfire solution isn't limited by the single DVI link - it's limited by the max bandwidth that the compositing chip can deal with.

      So they were wrong to identify DVI as the source of the problem. And you have further muddied the waters by ranting about dual DVI as the solution. Hope this helps.
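The whole dispute in this sub-thread reduces to pixel-clock arithmetic: single-link DVI carries at most a 165 MHz pixel clock, and whether a mode fits depends on how much blanking surrounds the active pixels. A quick sketch, using blanking totals close to the conventional VESA timing for 1600x1200@60 and approximate reduced-blanking figures for 1920x1200 (numbers are illustrative, not quoted from any spec text):

```python
# Pixel clock = total pixels per frame (active + blanking) x refresh rate.
# Single-link DVI is capped at a 165 MHz pixel clock.

def pixel_clock_mhz(width, height, refresh_hz, h_blank, v_blank):
    """Pixel clock (MHz) needed for a mode with the given blanking totals."""
    total_pixels = (width + h_blank) * (height + v_blank)
    return total_pixels * refresh_hz / 1e6

# 1600x1200@60 with conventional blanking: 162 MHz, just under the cap.
print(pixel_clock_mhz(1600, 1200, 60, 560, 50))
# 1920x1200@60 with conventional blanking: ~190 MHz, over the cap...
print(pixel_clock_mhz(1920, 1200, 60, 640, 35))
# ...but with reduced blanking it drops to ~154 MHz, which is how
# single-link cards drive the 23"/24" panels mentioned above.
print(pixel_clock_mhz(1920, 1200, 60, 160, 35))
```

Which supports both halves of the argument: the article is right about conventional timings, the reply is right that reduced blanking squeezes 1920x1200@60 under the single-link cap, and neither helps if the compositing chip is the bottleneck.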
  • Sounds cool, (Score:3, Interesting)

    by hungrygrue ( 872970 ) on Monday September 26, 2005 @11:18AM (#13650854) Homepage
    but where are the Linux drivers? Doesn't matter how awesome the card is if we have to wait two years for drivers.
  • by LTC_Kilgore ( 889217 ) on Monday September 26, 2005 @11:27AM (#13650934)
I feel that Crossfire's biggest flaw is the resolution limit of 1600x1200 @ 60Hz with Crossfire enabled.

The customers ATI developed this product for (the most rabid and devout hardware addicts with large budgets) most likely have either large CRTs (FW900) or high-resolution widescreen LCDs (2405FPW, etc).

The failure to recognize that these customers would want to run games at their display's native resolution is inexcusable.

Seriously, why else would someone drop $1000 to upgrade their graphics hardware if not to run the latest games at high resolution with full detail settings?

It doesn't matter that much. Not many people are going to buy a Crossfire system anyway.

      Why? Because it sucks ;). That's also why it didn't matter that they didn't get it to market that fast - nobody would have needed it; the only reason ATI even has it is PR and the feature list.

    • If ATI really didn't put dual link DVI in the SLI card, then I'm disappointed.

      I'm a bit disappointed with the DVI standard though, I have a 9600-something attached to a 21" CRT, and that combination can run 2048x1536@80Hz over the analog connections. I do get a bit flicker weary though, so I don't stay in that mode for long.
You're bothered by the flicker of a monitor running at 80Hz? That's near the top of what a lot of middle-of-the-road monitors can handle at common resolutions (1280x1024). Certainly you don't notice a big difference between 80Hz and 85Hz.
  • by Anonymous Coward on Monday September 26, 2005 @11:33AM (#13650982)
    Looks very interesting - I'd love to get one for review.

    Just one?
  • by maskedavenger ( 674027 ) on Monday September 26, 2005 @11:35AM (#13650999)
    Two stories on here both from Hexus. Haha. What was the poster thinking?! He wanted hits but now the site is down!

Crossfire will never work because who would buy a slave card... If your primary card fails, you're out two cards! Ain't that some stuff!

    Oh, and I was an ATI man myself for years. Started before Radeon, now I have a 7800GT and I will never go back... unless they offer me some kind of deal, and buy SLI rights.
    • If you can get to the Coral Cache, read the fine article (and this is the printable single-screen version) [nyud.net] and see that the master-slave relationship puts an existing ATi card into 'slave' position. The Crossfire 'Master' card has an FPGA to unite the output of the Crossfire card and its slave. Then read about how the Crossfire features can be switched on and off without restarting your computer.

Hexus rates Crossfire as a preferable way of linking two cards compared to nVidia's SLI. However, the present executio
  • Looks very interesting - I'd love to get one for review.

    More like...

    This article looks very interesting - I'd love to see the review.
  • Meh... (Score:5, Insightful)

    by ivan256 ( 17499 ) * on Monday September 26, 2005 @11:39AM (#13651019)
    Who cares about this stuff other than the tiny portion of the population that will ever use it?

    The whole point of this SLI stuff is marketing. It convinces people to buy a more expensive video card than they otherwise would have so that they can fool themselves into thinking they'll get a huge performance boost a few years down the line when they add a second card on the cheap.

    In reality, when the second card comes down in price, the SLI configuration will be outclassed on the same order of magnitude as the single card alone by the latest stuff, and you'll just end up having to buy a whole new expensive card, or living with slow graphics.

    So unless you've got a boatload of cash and are going to buy two top of the line cards *right now*, it really doesn't matter if either of these manufacturers SLI technology is any good. It's just a marketing gimmick.
    • Re:Meh... (Score:3, Informative)

      by Matimus ( 598096 )
SLI works well for game developers who are writing for next-generation hardware. While I agree that there is a lot of gimmicky marketing involved, I think it is a good attempt to raise the bar on what it means to be 'High End' as a response to market demand. Maybe some people do what you suggest, but I know a handful of people who bought two cards right off the bat. As long as people are willing to spend the money, you can't blame nVidia or ATI for giving them more opportunities to do just that. Also, I d
      • SLI works well for game developers who are writing for next generation hardware.

        Are you saying this from experience, or are you guessing? This doesn't seem accurate to me, since next generation hardware has additional features to go with the additional performance.

        • Works well != Works perfectly. Although I don't have game development experience personally, I have heard from the mouth of a representative for the team that is developing the new UT engine that they use SLI for this very purpose. Next generation features can be done with a software wrapper, but the extra graphics processing speed does help. If there is a better alternative I would like to hear it, and by better I mean cheaper/faster/quicker to implement.
          • If there is a better alternative I would like to hear it, and by better I mean cheaper/faster/quicker to implement.

            Use one card; interpolate the performance based on the specs of the next generation hardware.
        • Are you saying this from experience, or are you guessing? This doesn't seem accurate to me, since next generation hardware has additional features to go with the additional performance.

          From the 7800 GTX launch at Anandtech:
          As there has been no DirectX update since the last part, NVIDIA has opted not to introduce any extra features. Their reasoning is that developers are slow enough to adopt DirectX changes, let alone a feature that would only run using OpenGL extensions. (source) [anandtech.com]

          For the most part, a 6800 SL
  • by hexed_2050 ( 841538 ) on Monday September 26, 2005 @11:42AM (#13651040)
There aren't any games that require a 7800 SLI configuration, and there probably won't be for some time. If NVIDIA wasn't worried about releasing a card to wipe out ATI in performance, they could have released the 7800 series in another year or two and everyone would have been happy as pigs in shh.

If ATI can offer their Crossfire combination at an affordable price, there's nothing stopping me from saving a few hundred dollars and purchasing a card that's generally in the same era as the games currently out.

    Sometimes it's not all about speed, nor price, but value.
    • by Soul-Burn666 ( 574119 ) on Monday September 26, 2005 @12:16PM (#13651340) Journal
      What about game developers developers developers? Graphic designers? Movie CG artists?

      The developers need the power to test now what will be standard next year.
      Graphic designers and CG artists need as much power as they can have, for previewing, since the final results are going to need considerably more power anyways, rendered in batch on clusters.
  • by Manip ( 656104 )
This quote is taken from an interview about how they had to "Start Longhorn over" in early 2004. It thus does not relate to Windows Vista (Longhorn's new name), because Vista contains none of that code base, which was dumped.
  • Why won't ATI heed Jon Stewart's plea? Stop, stop, stop, stop hurting America!
  • " ATI Launches Crossfire... Finally "

    And NO Duke Nukem Forever Jokes???????
Well, allow them to commence in:
    5,

    4,

    3,

    ...


    Oh yeah, this is a Duke Nukem post, that countdown never hits zero... :\
  • Actually... (Score:5, Funny)

    by CurbyKirby ( 306431 ) on Monday September 26, 2005 @01:07PM (#13651766) Homepage
    Looks very interesting - I'd love to get one for review.

    The point is to get TWO for review. =)
And we were told this was not a paper launch. We just got nForce 430 in, however.
