Hardware

Matrox Releases G400 Specs

Anonymous Coward writes "Matrox just released the hardware specs for the highly rated G400 here (must register). According to the guys working on the G200 driver (including John Carmack), from a driver perspective it is very similar to the G200, so XFree86 and 3D support should be coming very soon. 1999 is going to be a good year for Linux and high performance graphics." With XFree86 4.0 out by the end of the year (hopefully), the Precision Insight news, and mandrake's work on Xinerama, I'm inclined to agree. Now, if only I could get my hands on one of those babies...
  • by Anonymous Coward
    It seems (to this newbie) that this card is rather keen on bump mapping. Skins with this feature would be rather swell. I recall the Nov. 98 PCGamer article on Q3a, though. The id folks mentioned that bump mapping would be useless eyecandy that killed performance in a DM game where nobody had time to appreciate it, so id were skipping it in Q3. However, Carmack just gave a little interview at Planetquake about his thoughts on supporting bump mapping on future hardware. He seemed more enthusiastic than before.

    Would anyone actually familiar with the technical hurdles care to speculate on the G400 fostering the inclusion of bumpmapping, complex lighting, and the like in Q3 or other imminent 3d shooters?
  • by Anonymous Coward
    It is always great to hear that more hardware support is forthcoming. However, I do worry about the present and future of OpenGL on Linux. You see, I just heard (can't remember where, but I hear it over and over again) from a highly respectable but totally obnoxious source that...

    Mesa in Bombad Troubles.

    Maybe this was taken out of context. ;-)
  • Huh? The 3D part is most definitely open-source. Where do you get your information?

    Nvidia didn't give out specs, though, so it's pretty hard to improve on the driver... they are looking into giving developers access to the docs, though.
  • Of course... management has said there is no chance of ever getting WARP specs, although we are working on a scheme to support this in an open-source driver by treating the microcode as a chunk of data that is loaded into the card when the driver loads, rather than as actual code.
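    For the curious: "treating the microcode as a chunk of data" means the driver never interprets the WARP program at all; it just copies an opaque, vendor-supplied blob into the card's memory at load time. A minimal sketch of that idea in C follows -- the blob contents, the mapped window, and the names below are hypothetical, not the actual WARP interface:

        /* Sketch only: upload an opaque, vendor-supplied microcode blob into
         * card memory at driver load. Nothing here is real WARP code; the
         * blob, the mapped window, and the names are illustrative. */
        #include <stddef.h>
        #include <stdint.h>

        /* Opaque microcode image from the vendor; the driver never decodes it,
         * so no specs for the engine itself are required to ship this. */
        static const uint32_t warp_ucode[] = { /* vendor-supplied words */ 0 };

        static void upload_microcode(volatile uint32_t *ucode_window)
        {
            size_t n = sizeof(warp_ucode) / sizeof(warp_ucode[0]);
            for (size_t i = 0; i < n; i++)
                ucode_window[i] = warp_ucode[i];   /* straight copy, no interpretation */
        }

    This is the same arrangement several posts below compare to SCSI and sound-card drivers that load proprietary firmware from the kernel.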
  • The TNT doesn't have any sort of triangle setup engine; if it did, they probably wouldn't release the specs on it. They also didn't release specs for anything, only code, which isn't exactly easy to debug if you find an error.

    With matrox, we will have a fully open-source driver with nearly fully available specs, minus the WARP.
  • BeOS is as bad as Windows NT. It's proprietary, therefore doomed to failure.
  • I wish I knew.... it's hard to think like a corporation.

    We amazed everyone with the speed at which we developed the G200 drivers; if we had the WARP, just imagine what we could do.
  • All decryption is done in hardware, as far as I know. We should see an open-source driver for it sometime.
  • You don't know that; they could have written it in a hex editor.
  • What? The 3d portion of the NVidia drivers is checked into the same source tree as the driver for G200.
  • Well, you've said this earlier, and I've answered this earlier as well, but since this message was moderated up... http://lists.openprojects.net/pipermail/g200-dev/1999-June/001323.html

    But the current bottleneck in the G200 driver is not the lack of documentation about the WARP microcode, but rather the lack of direct rendering.

    /Andreas
  • My take on this is that Matrox hardware people are very good, but Matrox software people are still learning how to make fast OpenGL drivers.

    /Andreas
  • The WARP specs probably contain a lot of information about the G200 architecture. This is microcode we are talking about, not some high-level language. A competitor trying to reverse engineer their chipset would be at a distinct advantage, I guess.
    But, as Stephen wrote, Matrox has stated that they are going to help us with the WARP.
    Look at the following URL for the original mail from matrox developer relations:
    http://lists.openprojects.net/pipermail/g200-dev/1999-June/001323.html


    /Andreas
  • According to the following URL, the TNT has a hardwired register based setup engine:
    http://lists.openprojects.net/pipermail/g200-dev/1999-June/001223.html

    You can also look at servGL/hwglx/nv/riva_prim.c
    Much less to do than in mga_tritemp.h as far as I can see.

    /Andreas
  • > Sure, Nvidia released "already-written GPLed drivers", but a) they aren't fully finished, b) they didn't send the specs along with the drivers, and c) the Nvidia drivers were based on ... the same GLX source base as the G200 drivers. Surprise.

    I think it is more correct to say that the G200 driver is based on the same source tree as the nVidia driver. David Schmenk at nVidia is responsible for the hardware procedure hooks in the GLX source tree. (Which, by the way, shows that nVidia started this long before the Riva Enlightenment petition.)

    By the way, nVidia's drivers aren't GPLed. They are under an XFree86-compatible license.

    /Andreas
  • This is a rough timeline:

    Original GLX module.

    Thomas Götz adds support for Matrox Millennium chipsets to the GLX module

    David Schmenk adds hooks to the GLX module to simplify hardware acceleration.

    Wittawat Yamwong adds support for G200 to the GLX module.

    The nVidia driver is released to the public

    Future: Integration with Precision Insight's DRI. (This will use the GLX implementation from SGI).

  • hmm.. perhaps I should have included this info from Terence's page (http://reality.sgi.com/ripperda_engr/glx/):
    I'd like to send a hearty congratulations and thank you to Dave Schmenk at NVIDIA for his hard work. I don't think people realize how long he's worked for this, or how he had already helped the Linux cause. The G200 driver descended from his templates and work on the TNT driver.
  • Matrox has stated that they are going to help us with the WARP engine. Take a look at the following URL: http://lists.openprojects.net/pipermail/g200-dev/1999-June/001323.html
    Besides, using the WARP engine will not be really helpful until we are using a direct rendering approach.

    /Andreas
  • by vipw ( 228 )
    They are freer than the GPL; it doesn't get better than that. The GPL is too restrictive a license, hence it isn't used by XFree86.
  • Okay, we've got at least two really good 3d accelerators here, or coming in the near future. So do I get a TNT2 or a G400? :)

    Both seem to have open specs and good performance..... so once my riva128 gets replaced, what to replace it with....

    And while we're at it, I wonder if we can force certain sound companies into giving us some specs.... (take a hint please, Creative.... people will still buy your cards if they know how they work. :) )
  • You're kidding, right? Actually, I believe you, but I did the math for that resolution, and at a decent refresh rate (80 Hz), you need something like 1200 MB/s bandwidth and lotsa lotsa memory for one screen image.
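    A rough back-of-the-envelope check of that figure, assuming 32 bits per pixel and roughly 25% extra for blanking (the blanking overhead is an assumption, not something from the post above):

        /* Scanout bandwidth estimate for 2048x1536 at 32bpp and an 80 Hz refresh.
         * The 25% blanking overhead is an assumed, illustrative figure. */
        #include <stdio.h>

        int main(void)
        {
            double bytes_per_frame = 2048.0 * 1536.0 * 4.0;   /* 32bpp = 4 bytes/pixel */
            double visible = bytes_per_frame * 80.0;          /* bytes per second */
            double with_blanking = visible * 1.25;

            printf("visible pixels only: %.0f MB/s\n", visible / 1e6);       /* ~1007 */
            printf("with ~25%% blanking: %.0f MB/s\n", with_blanking / 1e6); /* ~1258 */
            return 0;
        }

    So the ~1200 MB/s figure is in the right ballpark once blanking is counted, and a single frame at that resolution is about 12.6 MB of memory by itself.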
  • Posted by Moritz Moeller - Herrmann:

    > It has some nice specs and numbers. Don't count on any Matrox drivers though. The G200 is now outdated and there still isn't a decent OpenGL driver available. It'll be a cold day in hell before I buy another Matrox product.
    So have they released all the specs? For 3D, too?
  • Posted by Dr Evil:

    Released 3D specs, yes. I have not checked yet to see whether they have released ALL of their specs (the released G200 and Mystique docs omit very important speed-related register sets, like the G200's WARP engine, which does triangle setup).

    -David
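    Since "triangle setup" keeps coming up in this thread: it is the per-triangle step that turns three transformed vertices into the edge and attribute gradients the rasterizer then walks, which is what the WARP engine does in hardware. A minimal software sketch of one piece of that step -- the structures and names are hypothetical, not WARP or G200 driver code:

        /* Illustrative triangle setup: compute the screen-space gradients
         * (d/dx, d/dy) of one interpolated attribute across a triangle.
         * A hardware setup engine does this (and more) per triangle. */
        typedef struct { float x, y, attr; } vertex;      /* attr: e.g. one color channel */
        typedef struct { float dattr_dx, dattr_dy; } gradients;

        int setup_triangle(const vertex *v0, const vertex *v1, const vertex *v2,
                           gradients *g)
        {
            /* Signed double area of the triangle in screen space. */
            float det = (v1->x - v0->x) * (v2->y - v0->y)
                      - (v2->x - v0->x) * (v1->y - v0->y);
            if (det == 0.0f)
                return -1;                 /* degenerate triangle: nothing to draw */

            float inv = 1.0f / det;
            g->dattr_dx = ((v1->attr - v0->attr) * (v2->y - v0->y)
                         - (v2->attr - v0->attr) * (v1->y - v0->y)) * inv;
            g->dattr_dy = ((v2->attr - v0->attr) * (v1->x - v0->x)
                         - (v1->attr - v0->attr) * (v2->x - v0->x)) * inv;
            return 0;
        }

    Doing this on the host CPU for every triangle is exactly the work the driver is left with when the setup engine can't be programmed.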
  • If Matrox hasn't released the specs for the G400's triangle setup engine (They never did for the G200), this is just a marketing ploy. Buy a TNT.

    By releasing already-written GPLed drivers, NVidia jumped to the top of the pack. Esp. since they actually have decent Windows drivers if there are any games you still need to reboot for.

    Short of releasing a faster card (Is the G400 faster than the TNT2?) AND releasing full specs for the card, Matrox is dead in the water.

    (Note: Not long ago I was a Matrox fan and an NVidia hater. Then Matrox kept on slipping on drivers, and I got annoyed.)
  • Matrox refuses to release the specs to their triangle setup engine, which is a key portion of the card for decent performance.

    NVidia has released a full-blown GPLed driver. Instead of releasing docs, they actually went to the effort of writing a driver.

    And don't forget that Matrox doesn't give a damn about OpenGL gamers under Windows either.
  • For the uninformed (like I was until I looked it up), Xinerama is an X extension that combines two or more screens into a single wide virtual screen. I'm guessing this would be really good for video walls and other similar uses.

    About the G400 specs, awesome. I wrote to Matrox about this, and I guess this was their reply. :^)

    -- Does Rain Man use the Autistic License for his software?
  • I hear ya. Although I like the performance of my G200 for 2D X stuff, Matrox hasn't really taken the Linux/BSD user community seriously, and apparently refuses to release *all* the specs.

    I'm not a big gamer, but heck, I'd like to try out Q3 just like everybody else... :-P

    I'll probably buy a TNT card instead of another Matrox.

    --
    Get your fresh, hot kernels right here [kernel.org]!

  • It's even worse...it's single user!

    --
    Get your fresh, hot kernels right here [kernel.org]!
  • Already got that module. Slow and kinda buggy (at least as of a few weeks ago). It's got a long way to go to be usable. Not that I'm complaining...I'm glad that the people working on it are doing so. I'm just unhappy that Matrox didn't write a Linux driver like nVidia.

    --
    Get your fresh, hot kernels right here [kernel.org]!
  • I currently have a Millennium G200 w/ the Rainbow Runner attachment. My idea was to have a decently fast video board that I could do video capture with. So far, I've only really done 2D stuff with the board, and it's darn fast for that.

    The video board is only useful in Windows, unfortunately. So here's what I'm wondering:

    Does anyone know of plans to support the Rainbow Runner cards in the future? (I don't know if specs are released, otherwise I would know the answer.)

    I haven't really found a need for 3d acceleration or OpenGL stuff under Linux yet, so I'd be tempted to purchase the Metro Link OpenGL w/ MetroX stuff before buying a new card. They're about the same price, and I wouldn't need to do hardware swaps again. (Plus, I already use MetroX, so it's not that big a jump.)
  • As mentioned earlier, Matrox will not release specs for the triangle setup. nVidia gives us everything, and even wrote the driver under a free license. This one's real hard to figure out...
  • I think pricing is OK. Check it at:

    www.pricewatch.com

    Unless there is a supply shortage like with the Voodoo3 3500s, you will see the price dropping every three or four days!

  • See this image [matroxusers.com]. It's from Quake 3, with all the effects on. By the way, it's a 615 KB JPG...
  • Sometimes there is no source code for microcode. Certain devices -- especially custom, specialized ones -- don't even have an assembler let alone high-level language compilers. In this case, the microcode is probably scribbled down on a napkin or a notebook by an engineer and then entered with a hex editor.

    In the case of the X11 license, source code for modifications isn't even required, so this argument is moot anyway.

    For GPL licensed stuff, the source code for a work means the preferred form of the work for making modifications to it. If the microcode was originally authored with a hex editor, then an octet stream is the preferred form.

  • ...that will probably make the decision for me. The G400 is a "high end" card. It will have TV out, dual head support on the card, and all sorts of whiz-bang features for DVDs and MPEGs. Most of those things I don't care about, so I'll probably just save my money and go with the more conservative TNT2. If you feel like going all out, though, the G400 is for you.

    --Lenny
  • > Mesa in Bombad Troubles.

    *chortle*.

    --Rob

    Comics:
    Sluggy.com [sluggy.com] - It rocks my nads.
  • That's exactly the conclusion I came to, also.

    Let's see if we can make better drivers than they can. :)

    John
  • Yes, the G400 is quite a bit faster than TNT2 or Voodoo3, at 32b color depth and high resolutions. At lower resolutions, it doesn't do as well as those cards.

    But if you can play Q3 smoothly at 1280x1024, why go back to 800x600? :)

    My reference is the recent Q3T benchmarks of various video cards by Id. The original data can be found at Id [idsoftware.com], and here's [shugashack.com] some analysis done at Shugashack. (I don't completely agree with their analysis, but I might be confused about an issue or two. Like all analysis on the web, use it as background, but make up your own mind when you have the data. The G400 is definitely damn fast on high res, high quality settings, though.)

    John
  • by zosima ( 8652 )
    Sure, and I thought my G200 was awesome. With 3D support booming in Linux and the dual-head display coming out with the next XFree86... this will be the card to have. Did anyone check out the max resolution? 2048 x 1536 @ 32bpp!!!
  • Okay, I am not a big fan of people complaining about moderation [fear what you hate, you may easily become it] but still. . . why??? I expressed an opinion and then gave some new information for people who didn't want to delve into the site. I found my info at http://www.matrox.com/mga/g400/technical/glossary/feature2.htm [matrox.com], if someone thinks I am lying. . .
  • I guess everyone missed the opening of the G200 3d specs a while back -- in any case, there are GLX drivers for the G200 under Linux. Now. Go to http://www.on.openprojects.net/glx [openprojects.net] and grab a binary. Watch the gears screensaver go *real* fast, or heck -- leave it running in the background on your desktop. :) In any case, you can certainly try out Q3test with your G200. (In fact, under Linux a G200 is faster than a TNT(2) right now because of driver maturity. Go figure.)
  • Another option is to get an (admittedly dated) G200 board. The specs are open, the driver under Linux is rapidly maturing, and hey ... you even get John Carmack hacking on the card drivers. :) I found one refurbished for $30 (no joke), and it works like a charm. It has all the 3d power I need right now, so ... me, I'm happy.
  • For those who weren't looking, Matrox released the G200 *specs* (minus the WARP setup engine, but more on that in a sec) a while back. After a few months of hacking by notables including John Carmack himself :), the G200 is pretty well supported in 3d under Linux.

    1) I'm happy you bought an Nvidia card, and the release of a TNT(2)/Riva128 driver is a good thing. However, without the specs to the card, it's difficult to do major optimization and rework of the driver. The consequence of this is that right now the G200 (and that's 200, not 400 or 400MAX) is faster in 3d under Linux than *any* Nvidia card, including the TNT2 Ultra. It's a consequence of not being able to really re-work the drivers, plus the fact that the drivers haven't been out as long. I have no doubt that this will change in the near future :), but for the moment the lack of specs seems to hinder the Riva driver development. (My theory on why there are no specs is that ... there simply are no easily-grok'd designed-for-a-third-party-to-write-a-driver specs, *anywhere* (including internally at Nvidia). Likely that Nvidia wrote a sample driver and gives that out with the chips for clients to customize ... I'm doubting they send the chips out with a programming manual. Just like any good programmer, Nvidia has left comprehensive chip programming documentation as the last thing to do. =)

    2) The G200/G400/G400MAX isn't even limited by the lack of the triangle setup engine. Having the WARP specs would give us a 25% increase in performance, assuming we could even saturate the card now. (But we can't, we only got asynchronous DMA in the last few days.) Soon -- as in, when we get a direct rendering interface in place -- the lack of the WARP will be a problem. But for now, it's not. And Matrox has committed to helping the open-source GLX drivers utilize the WARP, without releasing the specs on the device -- and this is fine. It would be a bit like Adaptec putting a developer on the task of helping the kernel SCSI guys develop a driver, without releasing all the details of chip operations -- the company does some legwork, and tells us how to integrate that proprietary microcode into our open-source driver. And thus, almost everyone is happy.

    3) Chill. Stop bashing Matrox. They are so far the most progressive 3d hardware company out there, at least in terms of releasing programming information. Nvidia is right behind them, and that's a great thing -- neither chip is encumbered with something like Glide, and both give good performance. This is a choice between two *good* options, not a win/lose proposition.
  • Oh, my bad. There were some Millenium II GLX drivers floating around out on the 'net that formed some of the basis of the G200 code, so I assumed those had either started from scratch or been built from SGI GLX code. I knew the two drivers shared a common heritage, but hmmm ... interesting to know that the Riva driver was the original. I wonder how the G200-dev folks got the Riva-less GLX code with hooks ... I'm thinking Terrence Ripperda is the likely suspect. :) (And, AFAICT, the G200 drivers are also under an XFree license, for easy integration into the XFree DRI source later this summer.)
  • > So, resolution: goes to Matrox
    > Speed: 3D speed goes to TNT2, 2D I'll bet goes to Matrox.

    The AC posting above this is pretty much right on about the 2D and 3D strengths/weaknesses of the two cards. I would only add that benchmarking so far has shown the G400 (and especially the G400MAX) to be very CPU-dependent. If you have a slower, older CPU, then the TNT2 will wax the G400. If you have something like a P3-500, it's a much more even competition -- and at high resolutions/bit depths the G400 starts pulling ahead. Oddly, I would vote the G400/G400MAX as more "future proof" but in the fast-changing world of 3d accelerators I doubt that's worth much. :)

    Again, under "normal" circumstances, the TNT-series of cards has much better driver support, etc. (I'm speaking of the situation under Windows.) However, if Linux 3D performance is your thing, my bet goes on Matrox for the best (OpenGL, ironically :) support under Linux, at least until/unless Nvidia puts together a publicly grokable specs booklet and releases it. It's rather ironic that pretty soon the OpenGL support for G2/400 under Linux will be better than it is under Windows.
  • I don't know of any plans in particular to support the Rainbow Runner -- I've been following the GLX development, mostly. However, I wouldn't be surprised if someone made a Video4Linux driver for the Rainbow Runner. Full-speed MJPEG capture is a pretty nice add-on, if you ask me. I suspect that since Matrox gave us the specs on the 3d engine, the specs on the video capture board won't be that much of a problem.

    Your last paragraph confuses me, though: why buy Metro OpenGL under Linux? There are 3d drivers *now* that support your G200 under Linux (using Mesa), and do it pretty well. Besides, doesn't Metro's OpenGL support only extend to the Permedia cards right now?
  • by Anderson ( 8807 ) on Saturday June 19, 1999 @08:50PM (#1843116)
    Hang on a sec. I think calling this a "marketing ploy" is going a little over the line. Two points about the triangle setup engine: it's not necessary to have the WARP triangle setup specs in order to get good 3d acceleration. You still get the rendering engine, after all, and it ain't half bad. But regardless, the WARP gives you about a 25% performance boost (J. Carmack's estimate, not mine), and Matrox has committed to helping the open-source driver developers use it. Considering their track record on promises to open-source folks, I'd say they're serious. On this one, let's wait a month or two and see what happens with the WARP -- my feeling (and the attitude on the GLX development list) is that Matrox will probably do what they say.

    Sure, Nvidia released "already-written GPLed drivers", but a) they aren't fully finished, b) they didn't send the specs along with the drivers, and c) the Nvidia drivers were based on ... the same GLX source base as the G200 drivers. Surprise.

    As to your note on Matrox needing to have a faster card *and* releasing full specs ... well, I don't want to get into a pissing contest here. But the result of having the card specs is that the *G200* is faster than any TNT(2)(Ultra) card under Linux right now -- the drivers are just better. So, if your only metric is Linux 3d (OpenGL) speed, then I would guess Matrox is a-okay: they've opened the specs on everything we could have asked for (as a public company, they would likely be liable and be sued for releasing specs on something as proprietary as the WARP ... I'm amazed they were able to release the G2/400 specs, personally), and have committed to help us (which, notably, is the same response we have from Nvidia -- code and a commitment to continue helping) on the small remaining parts. Furthermore, their G200 (not to mention the forthcoming G400) is arguably the fastest Linux 2d/3d combo accelerator at present. (The Voodoo2s still beat up on it in pure 3d performance.) I'm not running off to buy Matrox stock or anything, but in regard to their open-source community standings, I would say they're doing all the right things.

    Seeing this message and your other anti-Matrox message, it looks like you've made the full transition from Matrox fan/Nvidia hater to Matrox hater/Nvidia fan. Might I suggest reserving religious commentary for something other than graphics cards? :) There are better things to get worked up about -- both of these companies have helped out 3D under Linux *tremendously*. Bashing Matrox here doesn't do anyone any good.
  • > By releasing already-written GPLed drivers, NVidia jumped to the top of the pack. Esp. since they actually have decent Windows drivers if there are any games you still need to reboot for.

    The driver isn't GPLed. It's an X-like license:

    Users and possessors of this source code are hereby granted a nonexclusive, royalty-free copyright license to use this code in individual and commercial software.
  • > Instead of releasing docs, they actually went to the effort of writing a driver.

    What good is that? "We release this driver, but since no one but us can get at the docs, fat chance any of you guys can patch the code anyway."

    > And don't forget that Matrox doesn't give a damn about OpenGL gamers under Windows either.

    Well, they have a beta driver, though it doesn't work very well. Their Direct3D support, though, is way up there, and more WinDOS games use D3D than use OpenGL.

  • XFree's license is not specific about this case, but I know for a fact that you can compile microcode into the Linux kernel which we do not have the source for, for devices like sound cards and SCSI controllers which run proprietary microcode.

    This is a grey area which hasn't specifically been addressed by the various "open-source" licenses.
  • What would you do with the specs (if they even exist in any readable state) anyway? It's microcode! Are you going to hand-code this stuff? They would have to release a whole development suite of software. This would cost them far too much money.

    The video card market is very cut-throat; they don't have the spare cash to throw at things like that -- even if it is good for them in the long run, they could go out of business before it pays off.

    I think their support for Linux has been excellent, especially publishing the G400 specs this soon. All we want is the WARP code data to download to the card, just like the SCSI drivers do.

    Where are the specs for the TNT? You may have the software, but it's not the same as the specs, far from it.

    ---

  • It looks like the moderators here are anti-Matrox (maybe they work for the opposition?). Why is this informative?

    Later down the list someone gives some good info on the card -- which gets lots of follow-ups -- and it gets marked redundant? (I bet this gets moderated down too :)

    ---

  • NVidia's choice to release the source to the drivers (and from the FAQ, it sounds like they plan to continue to support linux, and well at that) drove me to buy a TNT2, but dammit if I didn't want the G400MAX even then! Gar, had I only waited.
  • Ok, I'm sure Crow (I'm assuming you're Stephen Crowely [sp?], unless I've been confused for a long time) has adequately explained this, but I'll chime in. People keep complaining about the WARP specs not being there. It doesn't really matter yet anyway. Besides, this is a bunch of people who asked for specs and got some of them. Matrox is willing to work with them. Likewise, NVidia has released source but no specs. Both have made great contributions but haven't gone all the way. Neither seems like that much of a publicity stunt, just another way of keeping/getting customers.

    Also, some people seem to think that support for the G[24]00 cards comes at the expense of support for the NVidia cards, but that isn't so. They share a lot of source, from what I can tell, and not all the developments the G200 development people are making are specific to Matrox products. GLX has a bit of a ways to go, and they're making it go.

    I've been on the G200 dev. mailing list for all but the first week it existed. It's been very interesting. I've learned a lot. I was very shocked when I noticed that first message from John Carmack! I've since moved to digest form (I'm not contributing anything anyway!), but I still get it, and it's still interesting. Everything is progressing along fine, and there really isn't any bad news at all, except for the impatient.

    And for the license-complainers talking about microcode and the WARP engine, someone on the g200 list has repeatedly pointed out that the Linux kernel contains microcode, and that is quite GPL'd.

  • This is terrific news, from any perspective. I'm not a gamer, but can't help drooling over the G400MAX! I hope that the drivers, once they appear, will support multi-head and composite video, as well.
  • Just because a company does not embrace Stallman's philosophy does not make it bad, nor does it mean their products are doomed to failure.

    The Linux growth curve is impressive, but no one is seriously anticipating the demise of NT. It would be more likely to see Novell collapse, and no one seems to be counting the days on that one, either.

    Save glib fervor for religious issues.
  • Matrox conveniently left the specs for their triangle setup engine out of their G200 spec release. Has something similar been done here?
  • It's just data. It doesn't run on the host CPU at all.
  • The G400 16 MB version: $150
    G400 32 MB version: $200
    G400 MAX (faster 32 MB version): $250

    I just ordered a max a few days ago (they've been taking online orders for a little while, it'll take the boards about 3-4 weeks to get there, unless it's a max, which will take 4-5 weeks.)

    They look great, and we'll see how it all turns out on the 21" monitor, eh?

    -ehfisher
  • I've seen about a 10% increase in the fps reported by gears over the last 3 weeks or so. If you check in on their list archive now and then, it's plain that they're pursuing things aggressively.

    And John Carmack is posting so regularly that he seems to be part of the team.

  • And without specs released, can we (the members of the GNU movement) take a look at NVidia's driver and say 'Hey, this could be written better in parts!'? No, I don't think we can. Neither Matrox nor NVidia has done enough.

    A driver is nice, and a GPL'ed driver is better, but specs would be EVEN better.

    Specs are nice, but full specs would be EVEN better.

    And shut the fsck up about Matrox and GL in Win9x. They shipped a driver with the G400. The only reason that NVidia had a good driver to ship with the TNT/TNT2s is that they first had the Riva128 to experiment with. Did you ever try the original OpenGL drivers on the Riva128? Simply put, an OpenGL driver is a BIG thing to do. Don't slam Matrox; they tried and eventually delivered.
  • Is Matrox going to release the specs or a driver for the DVD decoder add-on for the G400?
  • What are they so concerned about? Are they afraid that someone might actually improve their driver?

    Criminy, people - do you /really/ want to learn how to program and optimise a proprietary ASIC? How would this benefit Linux/*BSD development?

    The WARP spec issue is /no/ different from the microcode blocks included for some Adaptec drivers, which nobody complains about.

  • ..after reading all your comments. Thanks!
    But the question still remains: which card should I buy? I want TV-out.
    And do the drivers from nVidia work with all card versions using the TNT2?
  • I think Be looked ahead on this part. They already have some of the underlying support in for multi-user; it wouldn't be as hard to do as in, for example, Windows 95.

  • Be has stated that they are working on a G400 driver even as we speak. It isn't in the new R4.5, but it probably will be released on the Web before the next version of BeOS. So, never fear, Be is on top of things as usual! :)

    Regards,

    Jared
  • Funny... I've owned one since January.
  • The dual-head G400s have TV-out from day one. No need to wait for that.
  • > Slow and kinda buggy (at least as of a few weeks ago).

    Try it out again... This project is developing at breathtaking speed!
  • I am *very* pleasantly surprised to see the G400 chip specifications availability. With the full G200 specs taking quite a while to emerge, I was quite sure that the same delay would occur with the G400.

    But the specs came out! And not only that, they're out before the video card is even on the shelves! I'm guessing that either the specs documentation was a little easier this time around, or Matrox is really trying to win the one-upmanship battle with NVidia (judging by NVidia's recent open-source Linux drivers).

    That's what's great about getting hardware vendors to support Linux--you only need to convince one or two. Once that happens, the other companies will do it for fear of losing competitive advantage.
  • by Stickerboy ( 61554 ) on Friday June 18, 1999 @10:58PM (#1843142) Homepage
    The biggest hurdle to implementing bump mapping is artistic. To get the best effect from textures in hardware that doesn't support bump mapping, the light effects must be drawn in by hand in the texture (i.e. drawing the shadows on a brick wall). In a bump mapping-capable system, you want the hardware to do that, so the textures are drawn differently. Supporting both types of systems effectively doubles the workload of the texture artists, which is one of the main reasons why bump mapping isn't in q3a.

    Another reason is the non-negligible 20%+ performance hit from enabling env. map bump mapping, as displayed by the Matrox G400. When an Ultra TNT2 is having trouble reaching 60 fps on "Highest Quality" in q3a, it doesn't make sense to devote the extra effort for a feature very few will use. As it is, q3a engine licensees are supposed to be able to easily enable bump mapping in the engine itself, for future games. (I can't remember which id .plan that was in.)
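    As a rough illustration of what bump mapping asks the hardware to do per texel, here is a minimal diffuse-lighting sketch where the surface normal is perturbed by bump-map gradients before the lighting dot product. All names and structures are hypothetical; this is not G400 or q3a code:

        /* Sketch of per-texel bump-mapped diffuse lighting: perturb the normal
         * by the bump map's du/dv gradients, then take N.L. Without this,
         * the same shading has to be painted into the texture by hand. */
        #include <math.h>

        typedef struct { float x, y, z; } vec3;

        static vec3 normalize3(vec3 v)
        {
            float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
            vec3 r = { v.x / len, v.y / len, v.z / len };
            return r;
        }

        static float dot3(vec3 a, vec3 b)
        {
            return a.x * b.x + a.y * b.y + a.z * b.z;
        }

        float bump_diffuse(vec3 normal, vec3 tangent, vec3 bitangent,
                           float du, float dv, vec3 light_dir)
        {
            vec3 n = {
                normal.x + du * tangent.x + dv * bitangent.x,
                normal.y + du * tangent.y + dv * bitangent.y,
                normal.z + du * tangent.z + dv * bitangent.z,
            };
            float d = dot3(normalize3(n), normalize3(light_dir));
            return d > 0.0f ? d : 0.0f;    /* clamp light from behind the surface */
        }

    Doing this for every texel is the extra per-pixel work behind the 20%+ hit mentioned above.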
  • I would wait if I were you and get the G400. Not only does it appear to have better colors than both the TNT2 and V3, it is also faster. Matrox is supposed to be releasing a TV-out version a few months (maybe more or less) after the G400 comes out.
