Graphics Software Hardware

Nvidia Launches New Affordable GPU 321

mikemuch writes "Today Nvidia unveiled a new low-cost, high-power graphics processor SKU. ExtremeTech's Jason Cross has done all the benchmarking, and concludes 'This makes for an impressive bargain and a huge step up from the generic GeForce 6800. The big question: How will this fare against ATI's similarly priced X1000 series card, the Radeon X1600 XT?'"
This discussion has been archived. No new comments can be posted.

  • Tech Report Review (Score:3, Informative)

    by hattig ( 47930 ) on Monday November 07, 2005 @12:45PM (#13970873) Journal
    Here's a pretty decent review I read earlier:

    nVidia 6800GS [techreport.com]
  • by fuzzy12345 ( 745891 ) on Monday November 07, 2005 @12:48PM (#13970929)
    It's been some time since we last ran our GPU Price-Performance shootout. Despite nine months having passed, not a whole lot has changed in the landscape.

    The real sweet spot for graphics is in the $250 to $300 price range.

    We have no idea what the heck is going on here.

    The big question: How will this fare against ATI's similarly priced X1000 series card, the Radeon X1600 XT? In short, we don't know.

    • by ruiner5000 ( 241452 ) on Monday November 07, 2005 @01:25PM (#13971388) Homepage
      Yeah, ExtremeTech is, after all, a big tech publisher's attempt at a tech-enthusiast site. If you are in the $250-$300 range, then you should spend an extra $33 and go with this eVGA 7800GT. [dealtime.com] It is worth the extra chunk of change. Not only will it be much faster than the cards that ExtremeTech recommends, it also uses less power than the 6800GT, and therefore puts off less heat. That is a no-brainer in my book.
    • by aywwts4 ( 610966 ) on Monday November 07, 2005 @01:29PM (#13971445)
      I think this shows everything that's wrong with the tech review industry. They adver-review cards pretty much only for kids to drool over and feel bad about their existing card, which works just fine on pretty much every game they play, and for "enthusiasts" (i.e., one born every minute).

      Instead of working as a Consumer Reports-type site, where I could look up a good graphics card for my ~$700-1100 computer (not my $4,000 Alienware), they leave me digging through archaic reviews from a few years ago, with test results on old drivers.

      Wow, this just in: a $700 dual-SLI card setup can play games at resolutions larger than my monitor can handle, at colour depths the human eye can't discern, at a framerate so fast the human eye doesn't pick it up, on a game that probably wasn't made to take advantage of the card, and with an actual visual improvement I can barely notice. But the good news is I smoke 'em when I run a benchmark utility.
      • Yeah, no kidding. I am *very* used to running games at moderate detail with all of the AA/AF turned off, as I have Jurassic-era equipment by gamer standards (a P4-M 2.2 GHz running an AGP 4x Radeon 9000 64MB). It does not make that big of a difference to me anyway, as it still looks nice, but in an FPS game, do you just stand there admiring the scenery? No! You run around and shoot the bad guys.

        And I laugh any time I see people doing CPU framerate comparisons at 640x480 or 800x600 with everything dialed down and j
        • TV is 50-60fps (though it is interlaced). The other advantage TV and movies have to help avoid flicker is the motion blur that they get for free. Video cards render each frame as if the stuff were frozen in place and a picture were taken.
      • Are you a gamer? If so what do you play?
      • Wow, this just in: a $700 dual-SLI card setup can play games at resolutions larger than my monitor can handle, at colour depths the human eye can't discern, at a framerate so fast the human eye doesn't pick it up, on a game that probably wasn't made to take advantage of the card, and with an actual visual improvement I can barely notice. But the good news is I smoke 'em when I run a benchmark utility.

        You can notice 4xAA and 8xAF being turned on, both visually and in framerate. When everyone is running 192
      • I've been tracking video card reviews for years. Typically the performance of a GPU doesn't change much after its introduction. What would be the value in doing a follow-up review?

        Most of the top review sites keep a generation or two of older chips in their comparisons. Some even compile regular guides on value and midrange-priced parts. If you can't find information on cheaper video cards, you aren't looking hard enough.
    • my sweet spot pricewise for a graphics card is $30... not a penny more...
  • by amcdiarmid ( 856796 ) <amcdiarmNO@SPAMgmail.com> on Monday November 07, 2005 @12:49PM (#13970937) Journal
    http://theinquirer.net/?article=27493 [theinquirer.net]

    Nice of them to cut the price. I would have liked them to keep the SKU so I didn't have to keep up with another one, although I suppose if they hadn't rebadged it, everyone who bought the 6800 would be pissed at the price cut.
    • Although I suppose if they hadn't rebadged it, everyone who bought the 6800 would be pissed at the price cut.

      Isn't that what happens with technology... prices go down? I got a 6800 for Christmas last year, a Black Friday CompUSA deal for 200 bucks after rebate... By this time, I'd almost expect it to be down around 100 bucks.

      Also, on another topic: on some of these cards you can use RivaTuner to unlock the extra pipes and pixel shaders, too... great if it works, but of course it's not guaranteed. Mine,

    • It's not simply a 'rebadged' card. Not only did they bump the clock speeds from the 6800's 325MHz core and 700MHz memory to a 425MHz core and 1000MHz memory, they also switched from DDR to GDDR3 memory to achieve the new memory clocks. This is as much of a difference as there is between the 6600 and the 6600GT.

      It's not so much a price cut on the 6800GT as it is a clock-speed (and price) boost to the vanilla 6800 that brings its performance to the same level as the 6800GT while still keeping a lower price (rough numbers sketched below).
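
      A rough illustration of those clock bumps (not from the comment above; a hypothetical Python back-of-the-envelope that uses only the figures quoted):

        # Hypothetical sketch: relative clock increases from the vanilla 6800 to the 6800 GS,
        # using only the figures quoted above (effective memory clocks, not benchmark results).
        core_6800, mem_6800 = 325, 700   # MHz, vanilla 6800
        core_gs, mem_gs = 425, 1000      # MHz, 6800 GS

        core_gain = (core_gs - core_6800) / core_6800 * 100
        mem_gain = (mem_gs - mem_6800) / mem_6800 * 100
        print(f"core: +{core_gain:.0f}%, memory: +{mem_gain:.0f}%")  # core: +31%, memory: +43%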
  • This is insane (Score:4, Insightful)

    by PoderOmega ( 677170 ) on Monday November 07, 2005 @12:50PM (#13970953)
    We are often asked "Which video card should I buy?" We always answer with "well how much do you want to spend?" The inevitable reply is that everyone wants to run all the latest graphics-heavy games at high resolutions with all the features enabled, but they only want to spend $100 to $150 to do so. Sorry to say, but that's just not going to happen. The real sweet spot for graphics is in the $250 to $300 price range.

    I cannot express how frustrating this is. People, please do not spend more than $150 on a video card. This is just insane. I guess we do need people like this to keep the graphics market hot by paying $300 for a card. I just hope game manufacturers don't think that their games should require $300 cards.
    • What I always say about this: if it costs more than the current game consoles, it's too much.

      Though I guess I might have to change my reasoning soon, seeing as Sony and Microsoft appear to be aiming quite high in their next generation...
    • Re:This is insane (Score:3, Insightful)

      by Buddy_DoQ ( 922706 )
      What else are young gaming geeks going to do with their money? They live at home in mom and dad's basement with 100% disposable income; 300 bucks for a new GPU is nothing. It's a hot-rod culture: rather than Mustang parts, it's computer parts.
      • Yeah, you're partially correct. The thing is, most games I've seen don't *require* these hugely expensive video cards to play them. They only need them to run in "high detail", with all the "eye candy" options turned on. If you turn all that stuff down, the game will be quite playable on a much less expensive setup.

        But so many gamers can't stand the fact that a game can possibly overwhelm their computer, so they fork over the money to upgrade - and then complain about it.

        Personally, I think the alternati
    • Re:This is insane (Score:4, Insightful)

      by rovingeyes ( 575063 ) on Monday November 07, 2005 @01:02PM (#13971121)
      I just hope game manufacturers don't think that their games should require $300 cards

      Simple: OEM pressure. I can confirm this because I have a friend who works for Microsoft, and I asked him why it is that every year we are forced to upgrade. Can't you guys do with what is already available? He told me that they could optimize their systems to run far better on existing hardware, but the OEMs don't like that. Dell apparently wants users to upgrade every 2 years or so. Bottom line: they don't care about the end user. They know that the end user will spend to use the latest and greatest software.

      • But Dell has basically zero bargaining power against Microsoft. What, are they going to sell all of their PCs without Windows on them? They'd go under almost instantly. For consumer PC operating systems, Windows is the only game in town right now. That means Microsoft can do whatever they want and Dell just has to take it.
        • It's a tradeoff. If, for example, Microsoft were to add useful features, improve stability & security, reduce memory and disk footprint, and improve performance, then they would possibly get more money from people upgrading their old computers to the new OS. Right now, people think "upgrading means a new computer--that's too expensive!". If all they had to do to get a better-performing machine was to buy the new version of MS Windows, it would be a smaller sticker shock. And more people would want t
          • Take a look at your argument from Microsoft's perspective. Nearly every new Dell sold comes complete with the Microsoft Tax. How many people get a new machine and think, hey, I'll just re-use my old XP license on this new machine and save money? Nobody does, and the option isn't even presented to them.

            The more machines Dell sells, the more money both Dell and Microsoft make. Same goes for any other OEM that's selling computers. Microsoft will NEVER improve the OS to the point where it makes old machines ru
    • Re:This is insane (Score:4, Interesting)

      by CastrTroy ( 595695 ) on Monday November 07, 2005 @01:02PM (#13971128)
      This is especially true when the newest console is only $300. I like PC gaming more than console gaming, but in the last year I've switched to consoles because it's just so much cheaper. In about the time that a console stays around, 3 years, you'll upgrade your video card a couple of times, or upgrade it once and spend twice as much. That means the video card(s) alone, not including all the other necessary upgrades, will cost as much as the console. I got tired of trying to keep in my head which video card is good, because there are about 75 models out there, and which one has the proper drivers to support the games I will want to play. Also, what bothers me is that if I upgrade my operating system, my video card from a few years ago might not have supported drivers, or if I buy a new card, it may not work with my older operating system, forcing me to upgrade. I really gave PC gaming a chance, but there's just too much hassle. I'd rather put up with games that don't look quite as good, or maybe are a little less fun to play, in exchange for not having to deal with the frustrations of playing games on a PC.
      • This is especially true when the newest console is only $300.

        It's especially untrue if you already need a moderately high-performance PC for other things. If you're going to have the monitor, and the CPU, and the memory already, buying a $250 video card for gaming is $50 cheaper than a $300 console.

        Like I'm one to talk though... I buy the consoles, and the video card. :)
    • Actually, that doesn't bother me at all - if people want to spend $300 on a GPU, more power to them. What bothers me is that the headline presents a $150-$200 GPU as 'affordable'. $200 for a GPU is a lot of money for some people, and this is especially true outside the US, where exchange rates and taxes come into play. I was expecting a sub-$100 GPU, a la the FX5200.

      BTW, I do own an FX5200 and I'm able to play Quake 4 with special effects perfectly fine on it - yes, at 640x480, but it still looks and
    • Once again, I need to upgrade my video card (ATI Radeon 9800 Pro AIW; 128 MB) just to play the newest and upcoming games, even at 1152x864 resolution with all graphics options at maximum. I have had to do this upgrade every one to two years ever since 3D cards were born (a Diamond Monster 3D/Voodoo1 card was my first)! At the same time, I am stuck with the AGP slot on my motherboard, since I am not upgrading it any time soon.

      It looks like I am aiming for a GeForce 6800 (128 MB; AGP) to buy in a few weeks. I am no
  • The Irony! (Score:5, Insightful)

    by Zemplar ( 764598 ) on Monday November 07, 2005 @12:55PM (#13971009) Journal
    Design goals:
    1. CPUs: High cost, low power
    2. GPUs: Low cost, high power

    Granted this is a rough approximation, but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving.
    • Give it time. Remember, graphics co-processors entered the game quite a bit after their general processing counterparts.

      Just as desktop CPUs are leaving the era of high-heat, high-power, balls-to-the-wall performance, GPUs are entering it. I'm sure that when people start to realize their 1GHz graphics card has a cooler bigger than their old P4's solid 400g piece of aluminum and a fan louder than a train wreck, the industry will come to its senses.

      And maybe, just maybe I can get a nice, quiet, low powe
    • but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving

      This is largely because of the completely different design methods and timelines in the two fields.

      CPUs are designed pretty close to the transistor level. They optimize the crap out of them, and try to do the most work with the fewest transistors. You have a lot of flexibility in changing the die size, the power consumption, and so forth. You can also ramp up the clock speeds to insane levels -- 3-4 GHz currently. This a
    • Yup, all computer equipment now costs far more to operate than to buy. Electricity is not cheap.

      I'm waiting for someone to make a serious (everything I've seen is a toy or junk) Mac mini-like AMD64 box. Apple is going to do that soon, and they are gonna sell a billion of those if they also run Linux and Windows, and there is no reason they wouldn't.

      I'm just fine with no special effects and 10 FPS. That my PC sounds like a small engine and puts out enough heat that even in the winter I have to open the window
    • Granted this is a rough approximation, but it seems that GPUs are destined to waste all the power [watts] modern CPUs are saving.
      You have a point, although the biggest need for low-power CPUs is for laptops (i.e. running on batteries), which you typically wouldn't use to play 3D games anyway, if they could even contain the high-powered GPUs to which you're referring.
      • "although the biggest need for low power CPUs is for laptops"

        Although this is indeed a real need for low-power CPUs, laptops are still only used a fraction of the time and basically only save battery power. However, I believe that lower-powered servers, which operate 24/7, would benefit everyone and the environment more, even though they are arguably fewer in number.

        For example, Sun is marketing some great low-power [watt] servers with outstanding performance [sun.com]. This is where I see the greatest benefit of r
  • by Work Account ( 900793 ) on Monday November 07, 2005 @12:55PM (#13971011) Journal
    I wish video card makers would be more CLEAR when they decide on names for their cards.

    We are one step away from having "Nvidia Model 8912347892389110".

    For laymen like myself who buy a new video card every few years, it is hard to know what is what in the video card market, since the names are very confusing, e.g. 6800 GS vs. X800XL vs. 6800 GT.

    Discuss.
  • by springbox ( 853816 ) on Monday November 07, 2005 @12:58PM (#13971067)
    This is great, but the title seems like an oxymoron at first (NVIDIA = cheap?). They used to make cheap video cards in the past that were crippled and performed poorly (the GeForce 4 MX cards). A good NVIDIA card used to cost half the price of an affordable computer, around $400. The last time I checked, all the value cards were in the $100 price range. I hope they can actually make something that's cheap and decent.

    You can probably get that previously $400 GeForce 4 card now for around $80. It would probably be more than enough for most people.

  • $250 (Score:5, Insightful)

    by RCVinson ( 582018 ) <RCVinson.gci@net> on Monday November 07, 2005 @12:59PM (#13971082)
    $250 makes for "a new low-cost, high-power graphics processor"?
    • Agreed WTF? (Score:5, Insightful)

      by bogie ( 31020 ) on Monday November 07, 2005 @01:23PM (#13971372) Journal
      $250 is a new breakthrough in affordability?

      I was naively waiting to read about a $100 GPU that performed well enough to play today's games at LCD resolutions.

      When you can build a very fast system with everything sans GPU for $400-$500, spending more than half the system cost on a single component sounds fucking stupid.
      • Now, don't believe the hype: games work fine on older cards. I have a 9800 Pro at home, and I haven't yet encountered a game that's a problem. No, you can't crank the resolution and details and such, but it plays all games, even new ones, fine.
        • The only game I've seen that eats a baseline card is F.E.A.R. From what I can tell, this has a *lot* to do with possible bad programming, as the graphics compared to many other games I've played (just fine thank you on an FX5200) really do suck even on the higher-end cards.
    • I just hopped over to newegg.com and they were listing the first 6800GS for $209. The lowest priced 6800GT is $269. The lowest priced 256MB 6800 in PCIe is $209 (there are cheaper 128MB cards on AGP, but I wanted to keep the numbers relevant).

      With the performance being nearly identical between the GS and the GT, the result is a 20% drop in the price at this level of performance (or a major boost in performance at the $209 level). Either way, I think it's fair to call it low cost, as long as you qualify t
    • No kidding. Anyone remember when you could get a top of the line Voodoo for $150? How did the normal top price shoot up to $500? And the "low cost" to $250?
  • Comparison / Review (Score:2, Informative)

    by DanteLysin ( 829006 )
    Review of GeForce 6800 GS and ATI Radeon X1600 XT

    http://www.hardocp.com/article.html?art=ODgy [hardocp.com]
  • by Anonymous Coward
    ...because no other standard-model human being would consider a $250 video card to be "affordable". Hint: for non-powergamers (including most geeks) "low cost" GPUs stop in the vicinity of $100.
  • by squoozer ( 730327 ) on Monday November 07, 2005 @01:22PM (#13971361)

    Is there a technological reason why multiple GPUs can't be put on a card? I freely admit I know very little about graphics cards, but it seems like it might be a cheap way to make a very powerful card. I seem to remember there was a card with two processors on it that failed dismally because it was basically twice the price. What about a card with 4 or 8 cheap processors? OK, the power consumption would be silly, but as long as it could be throttled so that only 1 GPU was used when not playing a game, it might work. Just thought I'd share that with you all :o)

    • There's a Dual-GPU version of the 6600 available from Gigabyte. The problem mostly comes down to power consumption and heat.

      That's more or less why SLI and X-fire are multiple-card solutions as opposed to expandable single-card solutions - it's that or have a single card with a heatsink so heavy it breaks the PCB.
    • by Jozer99 ( 693146 ) on Monday November 07, 2005 @05:02PM (#13973741)
      Um, been done many times before. Not only do you have SLI, which combines two cards, but there are several "SLI on a single card" monsters with two GeForce 6600s or 6800s on a single card. The first dual-GPU card was way back in the day; I think it was an ATI Rage. Also, Creative makes high-end workstation graphics, and they have a non-SLI dual-GPU card. Are you talking dual core? Well, it will probably be done soon enough; the problem is that the software support for multiple GPUs is really crappy (SLI is really not that practical for everyday use). Now, at least with PCIe, the hardware restrictions imposed by AGP are gone. I would expect to see something within six months, probably from SiS. It might take a little while longer for nVidia and ATI to come out with a dual-core card, although I'm sure it will perform better.
  • Old Trick (Score:5, Informative)

    by Nom du Keyboard ( 633989 ) on Monday November 07, 2005 @01:24PM (#13971376)
    Once upon a long time ago I worked for Control Data Corporation (anyone remember them?). CDC had a trick, which wasn't new to them, of re-badging essentially the same system with a new model number and a lower price. An example at the time was their popular CDC 3300 mainframe becoming the CDC 3170. The only difference between the models was that the CDC 3170 had a 1.75uS clock, compared to the CDC 3300's 1.25uS clock. Move one wire (the right wire!) inside and the CDC 3170 became the CDC 3300 in all respects except for the name badge on the equipment bays and console.

    Why do this, I wondered? The problem was in government contracts. After you'd paid back the design costs, additional computers could be pumped out at a cheaper price while still both making a profit and remaining competitive. The fly in the ointment was the government, which had often bought quantities of the earlier models when cost was not the first concern (when has cost ever been a concern to governments spending tax money?). I was told that government contracts stipulated that if you ever lowered the price on something you'd sold them, you had to rebate them the entire difference on every system delivered. Of course that would bankrupt any company, so they resorted to this rather transparent subterfuge.

    Perhaps some form of that is what's happening here as well.

    • While I don't doubt this is true, I heard a similar story 20 years ago about a company that sold two models of mini-computers. Apparently the only difference between the two models was a wire that had to be clipped to achieve the higher performance.

      The only problem is I've heard the story told about 10 different ways. I'm wondering if it's actually apocryphal?

      • There is plenty of evidence to back this up. The original 487 was a full-fledged 486 with the FPU enabled, and most modern processors are actually made the same way. The "ideal" few processors are the ones rated for the higher clock rates [that's not universally true; eventually you need a new design to get higher rates].

        In that case, though, it's because lithography is not a perfect process, and errors [e.g. skew, heat, etc.] can make a chip unstable at higher rates. That's why you'll see "worst case 100C" li
    • Notice that it has fewer pixel pipes. There are 4 blocks of 4 on a 6800-series chip, and one of those is disabled. However, the chip is clocked faster. My guess is they have found that they still have a number of chips where one of the four blocks fails, especially at higher speeds. OK, so just make a new line of cards that only has three blocks active, at a higher speed, and sell it. Gamers are happy, and you get to use more of your production capacity.
      • This is an NV42 chip, not NV45. It doesn't have 4 quads, just 3, not to mention it's made on the 110nm process. Nvidia is selling this because a 110nm chip with 3 quads is a good deal cheaper than a 130nm chip with 4 (the 6800GT).
  • While I've been enjoying my 6800GT and 7800GT cards, I'm worried by the fact that ATI can't seem to keep up. Ever since they lost the dominance they had acquired with their 9700/9800 series, they've been behind in performance, street dates, availability AND prices. It's already been 2 generations now. Any gamer knows that, today, nVidia reigns supreme.

    I hope that ATi regains the upper hand in the next round because things are looking grim for them. nVidia is a bigger company with bigger coffers and better ma
    • I hope that ATi regains the upper hand in the next round because things are looking grim for them. nVidia is a bigger company with bigger coffers and better marketing skills so they can withstand bad times more easily than ATi. They handled the whole 5700/5800/5900 debacle very well considering ATi's offerings ate them alive back then. God forbid ATi should go bankrupt and we end up with a de facto nVidia monopoly!

      Except ATI is providing chips for console manufacturers, and probably will make plenty of mon

    • I'm worried by the fact that ATI can't seem to keep up. Ever since they lost the dominance they had acquired with their 9700/9800 series, they've been behind in performance, street dates, availability AND prices. It's already been 2 generations now. Any gamer knows that, today, nVidia reigns supreme.

      Two generations - with one generation being six months, one year isn't that much to worry about. Who knows what new DirectX/OpenGL extensions will be invented in the future; superbuffers, real-time ray-tracing, r
    • It seems to me that console chip development must draw budget away from getting the absolute fastest card out in the shortest time, because once a company starts working with consoles, the other guy gets that 1% faster card out a month before they do.

      It happened to nVidia and it's happening again to ATI. It probably would have happened to Voodoo too if they hadn't self-destructed before console companies realized that they couldn't develop everything in-house anymore.
  • I bought a 6600 PCI-E for $179. Why did I buy a 6600 PCI-E for $179?

    It was the cheapest "non-crap" PCI-E from nvidia I could find. And you know what? It plays Far Cry, Thief3, Battlefield2 and the others JUST fine.

    This bullshit article about "needing a 6800GT to enjoy the games" is just that. Bullshit. Sure, the game may look shinier at 1600x1200 with 200fps and a billion texels/sec or whatever... But if that's what it takes to make the game "fun", we're obviously not playing the same games.

    Point is, this article is all about selling the latest bullshit cards you don't need. A 6600 will do you just fine if you're an average gamer [e.g. you have REAL work to do the rest of the day]; it can play games at 1024 and 1280 reasonably well [very well at the former].

    If you're on a budget and you think you need to spend $250 USD [keep in mind the $179 I'm talking about is Canadian, not USD] to enjoy games... you need a few moments of education :-)

    This is just a press release disguised as a 30-page article [chock-full of ads, no less] to sell the latest and greatest...

    Tom
    • The 6600 isn't a bad card, and if you're on a budget I'd totally recommend it. On the other hand, you can get significantly better performance for more money.

      Now, I'm not saying that these games aren't fun at 1024x768 with dynamic lighting turned off, blob shadows, and "medium" resolution textures, but it's still like the difference between watching a movie on an old television versus seeing it in a theater.

      If you have the money, you can make your games look significantly better for the price of two games.
    • Ah, but enjoyment is relative, isn't it? I average about 30 minutes per day of Battlefield 2, and occasionally will play through another game (I'm on interval 2 of FEAR right now). My previous system was an Athlon XP 2500+ w/ a gig of RAM and a 6600GT AGP. While it ran the game at 1280 (my LCD's native resolution), when involved in firefights my minimum framerate typically dropped to below 10 frames per second. It just got too frustrating for me to be continually killed, not because I lacked the skills (tru
  • by Animats ( 122034 ) on Monday November 07, 2005 @02:22PM (#13972072) Homepage
    GeForce 6800 GS - $249, according to NVidia.
    GeForce 6800 GT - $266, according to PriceGrabber.

    The cheaper model has 12 instead of 16 pixel shaders, and 5 instead of 6 vertex shaders. They probably use the same chip. The benchmarks are close. $17 cheaper. Big deal.

    In terms of price/performance, Via is probably the leader. They've just introduced some new S3 Chrome [techspot.com] boards that are roughly comparable to the GeForce 6800 line, but are priced around $150. That technology will probably be in Via's motherboard chipsets soon, at an even lower price. (A rough per-pipe cost comparison of the two Nvidia cards above is sketched below.)
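
    Purely as an illustration of the per-pipe arithmetic above (a hypothetical Python back-of-the-envelope; the prices and pipe counts are the ones quoted in this comment, and pipe count is only a crude proxy for performance):

      # Hypothetical sketch: dollars per pixel pipe for the two cards quoted above.
      cards = {"6800 GS": (249, 12), "6800 GT": (266, 16)}  # (price in USD, pixel pipes)
      for name, (price, pipes) in cards.items():
          print(f"{name}: ${price / pipes:.2f} per pixel pipe")
      # Prints roughly $20.75/pipe for the GS and $16.62/pipe for the GT,
      # so per pipe the GT is arguably the better buy despite the higher sticker price.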

  • Okay, I'm not a gamer. So this pretty much eliminates me from the target market of any of the new video cards. But I am willing to pay quite a lot for a video card that does the things I am interested in. Such as:

    - Video acceleration. Full MPEG decoding (not just iDCT+MC offload) for MPEG2, like the Unichrome video chips do. Full H.264 decoding is even more important, given its growing popularity and huge CPU requirements.

    - Open Source drivers, with full functionality. Good Linux support, enab
