S3 Graphics Comes out of Hiding with Chrome20
Steve from Hexus writes "S3 Graphics, having been quiet for a while, has today announced a new graphics solution, Chrome20, with which they intend to take some market share away from ATI and Nvidia. From the article: 'We were offered a chance for some hands-on play with a mid-range Chrome20 series desktop board - the machine was loaded with over 40 top games. A quick run of Half-Life 2, Far Cry, Halo and a couple of other titles demonstrated that S3G's new 90nm mainstream card was working without any visual problems and with very playable frame rates.'"
Sweeet! (Score:4, Funny)
{blink}
Re:Sweeet! (Score:2)
It's a new graphics solution.
You must be buzzword compliant.
The Obligatory Question (Score:5, Interesting)
Re:The Obligatory Question (Score:3, Interesting)
Maybe working more closely with the kernel developers, releasing the driver module as source code with the main kernel download, so it works out of the box.
Re:The Obligatory Question (Score:2)
Re:The Obligatory Question (Score:2)
--jeff++
Re:The Obligatory Question (Score:2)
Re:The Obligatory Question (Score:3, Interesting)
Hear that, S3? I know you people read Slashdot.
Obligatory Observation: SGI (Score:2, Interesting)
Back in 1995, SGI should have dumped its proprietary hardware: specialized graphics chips and MIPS. SGI should have created the following dream box: Linux + ARM + commodity graphics chips from NVIDIA, S3, Chromatics, etc.
The special sauce that greases every component is OpenGL. SGI should have leveraged its software technology and dominated the graphics market for decades to come.
Yet, no one at SGI listened.
The critics warned that x86-plus-commodi
Re:Obligatory Observation: SGI (Score:2)
Then they started using nVidia cards in their low end systems. So, in a way, they adopted commoditized hardware.
They have low end Itanium and possibly Xeon systems with nVidia graphics and SGI boards, bandwidth, etc. And MIPS workstations with nVidia graphics or possibly custom SGI stuff. And they still got their high-end which the commodity market can't touch because nVidia has no interest in building industri
If you mean like ATI's I'll stick with Nvidia.. (Score:3, Insightful)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:4, Insightful)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2)
Of course, the poster in question might have had no idea what he was talking about, but no one was queueing up to toast him either. Mind you, this wasn't on /.
It also doesn't mean that you mightn't have a card that's new enough to need the binaries, but old enough to break, I suppose.
The thing that's
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2)
That DEFINITELY qualifies as an old card.
And, yes, I used nvidia, not nv. nv doesn't have 3D acceleration, even on those old cards, and the system this was in only had a 233MHz Pentium MMX.
I agree (Score:2)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:5, Insightful)
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2, Interesting)
Liberty is the government not making laws placing restrictions upon you. It's letting you do what you want to without threat of force against you. Trading freedom for temporary security would be forever giving up some freedoms to be safer for the moment.
NVIDIA is a company; th
Re:If you mean like ATI's I'll stick with Nvidia.. (Score:2, Insightful)
How can you have solid support when only one company can maintain the driver for all GNU/Linux versions you may be running in the future?
Go away, ye false pragmatist.
Thank goodness... (Score:2)
Re: The Obligatory Question (Score:3, Insightful)
Re:The Obligatory Question (Score:2)
Re:The Obligatory Question (Score:2)
Re:The Obligatory Question (Score:2)
A tiny market, but a loyal one? (Score:5, Insightful)
Apple users are a small market, but they're incredibly loyal. Why wouldn't S3 get in on that action?
--grendel drago
Re:A tiny market, but a loyal one? (Score:5, Insightful)
Several lawsuits, as technology used in writing those drivers is patented, and they've likely cross-licensed the patents to even be able to write a modern 3-d driver.
Now you could strip out all the patented code, fix it into a working driver, and provide source for it, but ATI has already been doing that for years, yet all I see from the
So I hope this answers your question as to why they cannot do what you seem to think would be so easy. And hey, even if patents were a non-issue, the drivers would still be a 'trade' secret; giving that away to your competitors for free means that they will always know how to make their product perform better than yours.
Re:A tiny market, but a loyal one? (Score:5, Funny)
This is S3. I thought competitors *already* know how to make products better than theirs?
Re:A tiny market, but a loyal one? (Score:2)
giving that away to your competitors for free means that they will always know how to make their product perform better than yours.
Not necessarily. In reality, I think the ramifications of opening a graphics driver are analogous to a VCR company telling people where the buttons for play, fast forward, fast backward, etc... are; i.e., no useful competitive info at all.
Re:A tiny market, but a loyal one? (Score:2, Interesting)
totally true (Score:2)
I frequently see posts like the Grandparent asking why hardware vendors don't open up their video card drivers. The reality is these are HARDWARE vendors. They have outsourced much of the SOFTWARE development of drivers to third-party companies that have strict licensing requirements about how their code is going to be used. It isn't even so much about "know how to make their product perform better than yours" as it is keeping their lines of code in-house and private so they can get a contract to do anothe
Re:A tiny market, but a loyal one? (Score:4, Informative)
"now you could strip all the patented code, and fix it into a working driver, and provide source for it, but ATI already has been doing that for years, yet all I see from the
Correction: there is an open source radeon driver, but it only supports 3D acceleration for cards up to and including the 9200 models. Newer models only have 3D acceleration with the closed source 3D driver.
Up until ATI stopped releasing 3D programming information to the community, ATI-based cards were all I bought and recommended. The reason is pragmatic: I didn't have to worry about the card working with a new kernel version or the latest -mm patchset. This was my choice, in _spite_ of occasionally incomplete GL implementations (I seem to remember problems with Scorched3D on my radeon).
The last ATI card I bought was a 9200. Now, I buy nvidia. I may be stuck with a closed source driver, but at least it is a _good_ closed source driver. The latest version can do 3D acceleration over multiple cards (xinerama) if all GPUs are similar, which makes for a stunning game of quake on my triple-head system.
If S3 came up with an open source driver that was included in the kernel sources and a marginally competent 3D implementation, I would use them for future purchases in a heartbeat.
Re:A tiny market, but a loyal one? (Score:5, Insightful)
Ok, so let's assume you're right and the technology is patented. So what? This means that there are NO secrets allowed by the government in this product. The whole point of getting a patent is that you have to disclose your invention fully in order to obtain legal protection for it. If I want to see this patented technology, I can just look it up at www.uspto.gov. So this cross-licensed patents argument is a pile of BS.
Strip out the patented code... why? Again, if it's patented, there are no secrets. Now maybe the companies holding the patents won't license them in such a way as to allow open-sourced drivers, but this is a licensing issue, not a patent one.
Trade secret: well, are they patented or aren't they? You can't have a trade secret on something that's patented. The two are mutually exclusive.
You might want to learn about the various IP protections and how they differ before running your mouth.
Re:A tiny market, but a loyal one? (Score:3, Funny)
Re:The Obligatory Question (Score:3, Interesting)
Well, after visiting their web site [s3graphics.com] and not finding any Linux drivers for their existing cards, and not even any mention of Linux anywhere on their site, I wouldn't hold my breath.
Re:The Obligatory Question (Score:5, Informative)
Regards,
Steve
Re:The Obligatory Question (Score:2)
W3Schools has Windows, all flavors, at 90%, Mac and Linux at 3% each. OS Platform Stats [w3schools.com]. (August 2005) XP's share has grown over 30% since the spring of 2003, Linux only 1%.
Desktop numbers are highly biased against Linux simply because a) most Linux machines were previously Windows, and b) Windows machines tend to be replaced more often, i.e. if I buy a Windows computer today and another one in 2 years, both w
Re:The Obligatory Question (Score:2)
Don't ya love free advertising... (Score:5, Funny)
Yeah but.... (Score:2, Funny)
Re:Yeah but.... (Score:2)
Re:Yeah but.... (Score:3, Informative)
Re:Yeah but.... (Score:3, Interesting)
Re:Yeah but.... (Score:2)
CS Majors learn problem solving using computer programs.
I am a CS Major. NO I will not fix your PC.
Re:Yeah but.... (Score:2)
Solution, or a card? (Score:5, Insightful)
Re:Solution, or a card? (Score:2)
Don't you mean ... (Score:2)
Even more to come ... (Score:4, Funny)
I hope it's bundled with PowerPoint. (Score:4, Funny)
Re:I hope it's bundled with PowerPoint. (Score:2)
Re:I hope it's bundled with PowerPoint. (Score:5, Interesting)
The card in the machine was a 2MB Virge. Things I found out about the card over the next few minutes included:
1) It supported no resolution higher than 1024x768 at 60 Hz in 16-bit color.
2) The output looked so bad even on 2D that looking at the monitor hurt my eyes.
3) The instant I dragged any 3D game window, even older ones, to the monitor with the Virge card, they started going at about 10 frames... per minute.
The Virge was the worst graphics card I have ever used. A while back I even tried to run Homeworld on it (as a primary card). Lowest detail levels--check. Lowest resolution--check. Lowest memory allocation--check. End result: D3D hardware acceleration mode goes slower than software mode, at about 2 frames per minute.
Re:I hope it's bundled with PowerPoint. (Score:2)
The ViRGE was never marketed as a card for running OpenGL and Direct3D games. The OpenGL implementation was faster than software OpenGL on contemporary hardware - my Vi
Re:I hope it's bundled with PowerPoint. (Score:2)
Yep, that version of Terminal Velocity actually came bundled with the card, along with optimized versions of Descent and Tomb Raider, IIRC. I remember being in awe of how nice and smooth they looked (compared to software rendering).
It was definitely not an OpenGL card; it required that the (DOS) games be specifically tailored to use it.
There should be some kind of award for that... (Score:3, Interesting)
The Virge was definitely a dog back in its day, probably even worse than an ATI Rage II, but I would be impressed if any of its better-performing contemporaries (e.g., Rendition or Mystique) would be capable of that feat... I just did a search, and couldn't even find any evidence
Coming out with (Score:5, Funny)
Re:Coming out with (Score:2)
"Playable framerates" (Score:4, Insightful)
Read: Nowhere near the performance of ATI/NVIDIA.
Unless they plan on taking over the integrated graphics, $300 PC market, why bother?
Re:"Playable framerates" (Score:2, Insightful)
The point is competition. Far too long have we been stuck in a dichotomy of two superpowers.
But this isn't their first try, either. The S3 DeltaChrome was just average at release, and even got segmented off into integrated graphics in a few VIA chipsets.
Trident tried to dive back into the graphics realm. Their card didn't live up to the hype (m
Re:"Playable framerates" (Score:3, Informative)
Correction: Read "nowhere near the performance of ATI/NVidia's top-end models".
Why do NVidia bother selling the GeForce FX 5200 any more? It's crap compared to a 7800 GTX!
Oh, wait, it's because they can make a lot of money by capturing the low end of the market as well as the handful of geeks who are anal enough about frame rates to spend more on a single graphics card than the average person spends on a complete computer. Hey, you reckon S3 might just be p
Re:"Playable framerates" (Score:2)
The old Matrox, yes. Sadly, they do not release specs for their new cards, so they don't get good open source support. They might work fine on Microsoft Windows, but who runs that? (Not me, anyway.)
Though you will have to kill me to get my Matrox Millennium II from my FreeBSD box.
S3 video (Score:2, Insightful)
who are they kidding? (Score:2)
Playable Frame Rates* (Score:4, Funny)
Yet more magic pixie dust... (Score:5, Insightful)
Step 1: S3 introduces a new graphics card. The name is similar to one they've previously made, but you've never seen that card before because no one wants to produce and sell one. Specs seem similar too. As usual, it's supposed to be a mid-level card that won't "take on the big boys" but is supposed to have mainstream performance.
Step 2: Hardware review sites get a prototype board. They either experience a number of driver glitches, or performance that is vanilla enough that no one is all that excited.
Step 4: Joe Gamer reads the review, and buys a tried-and-true midrange solution from ATI or nVidia that doesn't have the driver issues S3 was famous for in cards that actually made it out the door.
Step 5: S3 has teething troubles with the GPU, or the drivers, or production, delaying the chip's release until its performance is at the low-end, yet priced $20-40 above others' low-end cards.
Step 6: The lackluster performance of the GPU relegates it to boards made by one dinky little vendor nobody has heard of and doesn't trust, with nonexistent support. S3 has to lower their prices on the GPU to get any sales at all.
Step 7: S3 doesn't profit.
I'm just curious...how does S3 manage to keep their graphics card business afloat? Aside from a few integrated solutions on VIA chipset mainboards, I can't see any products they manage to make money on.
Re:Yet more magic pixie dust... (Score:2)
1. Pick two machines, one Unichrome, one Intel Extreme
2. Boot up Linux distribution
3. Play with OpenGL screen saver
4. ?!?!? wtf
5. Realize that Intel 'Extreme' kicks the ass of the VIA chip, seven ways til Thursday.
6. Buy a cheapo FX5200 card and enjoy the 3D goodness.
HDMI? (Score:4, Interesting)
Personally, I'm getting beyond tired of technology companies who, some singularly and definitely collectively, make more money than Holly-hood, err, Hollywood, bending over backward to placate them. Yes, I know that the studios/**AA control the media/content for the most part, but if the _major_ technology players stood up and said "Well, we control the technology everyone uses to access your content, and there is no other tech company (or companies) large enough to challenge all of us, so THIS is how we're going to play ball," then WTF would Hollywood do except try to get more laws passed? Then all the technology companies that opposed Hollywood could band together to fight that off as well - dollar for dollar and then some.
And what would happen to the products of the companies that stood up to Hollywood - especially when the tech-oriented crowd started praising them to friends/family/etc? They'd sell multiples upon multiples of items that are free of DRM and friendly to the CONSUMER. Wow, what a frigging concept! Make products friendly towards the consumer, don't treat them like a dollar with a body attached, treat fair use rights as they should be treated, don't treat the customer like a criminal from the get-go, tell the **AAs to fuck off, fight piracy where it counts (you know, those media distributors in Hong Kong, Singapore, China, Russia, etc), and make millions upon millions of dollars.
Whew, I've had a very long day.. I think I need lots of sleep now. Sorry for the rant.
Re:Mod Parent Up HDMI? (Score:2)
S3 is back? Oh no! (Score:2, Funny)
Re:S3 is back? Oh no! (Score:2)
GP2 (Score:5, Interesting)
I've been waiting to see "coprocessor" PCI cards become popular, especially among gamers. I remember when we could buy "math coprocessors" to augment relatively slow/cheap math onboard the x86. That was before CPU manufacturing/marketing economics selected for all CPUs to have fast math sections, but with cheaper ones leaving the circuit lines "cut" to the fast part. Maybe that marketing hustle has inhibited the addition of "redundant" coprocessor chips.
GPUs are really just fast math coprocessors, optimized for graphics math and fitted with video coder chips. Gamers are the primary performancemongers and live at the bleeding edge of cranking performance. So they're the natural demanding market for pulling GPGPU products across the bleeding edge into mainstream architectures. Especially since GPGPUs aren't "Central", they're more likely to be "stackable", scalable processing units dynamically allocable for whatever's found at boot.
What we really need are GPUs that have "public" interfaces, either HW or SW (open drivers) that others can harness for GPGPU. Let's see if that kind of competition expands the market for these GPUs, instead of just fighting ATI and nVidia for the current market.
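To make "public interfaces that others can harness for GPGPU" concrete, here is a minimal sketch of the classic render-to-texture trick those projects use: upload your arrays as float textures, run a fragment shader as the "kernel" over a full-screen quad, and read the result back. This assumes GLEW/GLUT plus the ARB_texture_float and EXT_framebuffer_object extensions; it's illustrative, not vendor-specific.

// A minimal GPGPU sketch (assumes GLEW + GLUT, ARB_texture_float,
// EXT_framebuffer_object). Build: g++ gpgpu.cpp -lGLEW -lglut -lGL
#include <GL/glew.h>
#include <GL/glut.h>
#include <cstdio>
#include <vector>

// The "kernel": each output pixel is the element-wise sum of two inputs.
static const char* kFrag =
    "uniform sampler2D a, b;\n"
    "void main() {\n"
    "    vec2 p = gl_TexCoord[0].st;\n"
    "    gl_FragColor = texture2D(a, p) + texture2D(b, p);\n"
    "}\n";

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("gpgpu");            // just need a GL context
    glewInit();

    const int N = 4;                      // tiny 4x4 arrays for the demo
    std::vector<float> a(N * N * 4, 1.0f), b(N * N * 4, 2.0f), out(N * N * 4);

    GLuint tex[3];                        // two inputs, one output
    glGenTextures(3, tex);
    for (int i = 0; i < 3; ++i) {
        glBindTexture(GL_TEXTURE_2D, tex[i]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        const float* src = (i == 0) ? &a[0] : (i == 1) ? &b[0] : 0;
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, N, N, 0,
                     GL_RGBA, GL_FLOAT, src);
    }

    GLuint fbo;                           // render into tex[2], not the screen
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex[2], 0);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);   // compile the kernel
    glShaderSource(fs, 1, &kFrag, 0);
    glCompileShader(fs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    glUseProgram(prog);
    glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, tex[0]);
    glActiveTexture(GL_TEXTURE1); glBindTexture(GL_TEXTURE_2D, tex[1]);
    glUniform1i(glGetUniformLocation(prog, "a"), 0);
    glUniform1i(glGetUniformLocation(prog, "b"), 1);

    glViewport(0, 0, N, N);               // one quad = one pass over all data
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();

    glReadPixels(0, 0, N, N, GL_RGBA, GL_FLOAT, &out[0]);
    std::printf("out[0] = %.1f (expected 3.0)\n", out[0]);
    return 0;
}

The point of the exercise: the "public interface" is just OpenGL, so any card with a decent GL driver - S3's included, should they ship one - can be pressed into service as a math coprocessor this way.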
Re:GP2 (Score:4, Interesting)
OpenGL is a 'public' interface that effectively hides the hardware behind a standard API while also offering low-level programmability via its shader language. We already have what you're asking for.
Check out the GPGPU [gpgpu.org] project. It sounds like it might interest you.
Re:GP2 (Score:2)
The GPGPU project is exactly what I'm talking about. I'd really like S3 to check it out more, and prioritize its projects for support. Then they might expand their graphics market into that served by GPGPU functionality.
Re:GP2 (Score:2)
I've been checking out the Sh [libsh.org] project and it's a lot of fun. I agree; I'd like to see other vendors embrace the GPU as an 'it isn't just for graphics anymore' processor.
Re:GP2 (Score:2)
I'm intrigued by the Sony/IBM Cell architecture. Am I correct in believing they're using the Cell as both CPU and GPU, perhaps dynamically allocated per-process, or even per-boot? They might have bridged the gap between GPU and (2D/3D) DSP, for a real GP
Re:GP2 (Score:2)
Re:GP2 (Score:2)
Re:GP2 (Score:2)
No, seriously...I've been looking into building a poor man's cluster to play with and distributing Sh code to the various nodes.
Re:GP2 (Score:2)
Re:GP2 (Score:2)
I doubt the PCI bus has the required bandwidth to contribute anything worthwhile to the GFX card. SLI has taken the niche of producing twice the computing power. Not to mention, have you looked at the 7800 GTX SLI benchmarks? At 1600x1200 they're already overkill, they easily drive a 2048x1536 monitor too, providing you can find one. The only monitor that could possibly strain them is Apple's biggest monitor. Gamers don
Re:GP2 (Score:2)
Um... (Score:3, Interesting)
Average users don't tend to replace their cards very often. If they do, they'll go with a 6-month-old card from a major player, not a formerly-OK company that basically seems to be saying "Look at us! We're as good as anything else! w00t!"* And until computers run on $3/gallon gasoline, I don't think "lower power consumption" is going to move a lot of cards.
As for "better performance" when it comes to HDTV... huh? Lots of rigs today can play HD video just fine, and unlike games, video does not benefit much from an ability to show more FPS--once you get past 30, you're pretty much done. Besides, video playback--a series of raster images--has not been much of a problem for years now. It's rendering polygons that's hard.
Sorry, S3, but I don't think this will do much for you.
* except for the fact that it's not actually shipping yet, and those other cards have had drivers out for years, and games are already optimized for them, and...
Power Consumption (Score:2)
I've got a Shuttle XPC sitting next to my monitor with a GF6600GT sounding like a vacuum cleaner. I'd buy anything with comparable performance for $200 if it didn't have a fan or any funny "2-slot heatpipe to the back" cooler blocking the PCI slots. That said, I don't think such a card is available yet. Nor am I a large market.
The big guys have given up on fanless cards. If S3 says they're low power, I hope they don't need one. Fanless actually i
I hope they're successful (Score:3, Insightful)
My Take (Score:4, Interesting)
The way I read this is yet another small player wants to run with the big boys. What makes this one different? Well they admit up front that they can't compete in the high end so they will target the low end. Is this going to make a difference? I highly doubt it. I predict a flop.
I'm not trying to be too harsh. I'm just stating it like I see it. Personally, I'd like to see another player in this market, but I doubt it will ever happen unless someone like Intel decides to make high end graphics cards. Both ATI and NVIDIA spend hundreds of millions of dollars a year on R&D to make their high end cards, and all that R&D is applicable to the lower end discrete cards. The lower end cards nowadays use most of the great ideas we've come up with for the high end cards, but we just do fewer pixels in parallel, thus using fewer transistors. Our lower end cards are also fairly power efficient, even though this article didn't mention it (almost like they want people to assume our low end cards use 100W just like our high end cards do). Unless another company spends that kind of money, I doubt they'll compete. I'm not saying it's impossible, just unlikely.
I think the graphics industry is becoming less and less likely to have a major revolution (i.e. a move to something other than triangle-based rendering), which would make it much easier for a new player to get into the market. Graphics for the PC, with all its legacy software, is becoming more like the irreplaceable x86 platform every day. If we do change to something completely different it will probably come to a console first, but the longer we go on optimizing algorithms and hardware for these triangle-based systems, the less likely such a revolution becomes.
Most people who understand CPU architecture will tell you x86 is an old inefficient design, but Intel and AMD have spent so much time/money optimizing it that nobody can seem to come up with a new general purpose CPU that is better. I think the same thing is happening with graphics. The weird coincidence is that both of these fields have 2 major players...
Re:My Take (Score:2)
Re:My Take (Score:5, Informative)
Actually, x86 is a very inefficient instruction set. However, the inefficiency of the instruction set has been sidestepped mostly by on-the-fly hardware translation to a more efficient instruction set, large virtual register sets, out-of-order execution, and speculative execution. Neither AMD nor Intel CPUs operate on the x86 instruction set internally. Both of them translate x86 instructions into micro-ops internally and execute those instead -- believe it or not, they're doing in hardware much of what Transmeta was doing in software. The Pentium 4 doesn't even have a true L1 cache for instructions but rather uses an "execution trace cache" which holds pre-translated micro-ops.
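A toy illustration of that cracking step (the micro-op names and the three-way split here are invented for the example; real decode rules and encodings are proprietary): a single read-modify-write x86 instruction such as add [ebx], eax turns into a load, a register-only ALU op, and a store, with the intermediates landing in renamed temporary registers.

// Toy model of cracking one x86 instruction into RISC-like micro-ops.
// Illustrative only: the op names and temp registers are made up.
#include <cstdio>
#include <string>
#include <vector>

struct MicroOp { std::string op, dst, src1, src2; };

// x86: add [ebx], eax  -- one instruction, a read-modify-write on memory
std::vector<MicroOp> crack_add_mem_reg() {
    return {
        { "load",  "tmp0",  "[ebx]", ""    },  // 1. read the memory operand
        { "add",   "tmp1",  "tmp0",  "eax" },  // 2. ALU op on registers only
        { "store", "[ebx]", "tmp1",  ""    },  // 3. write the result back
    };
}

int main() {
    for (const MicroOp& u : crack_add_mem_reg())
        std::printf("%-6s %-6s %-6s %s\n", u.op.c_str(), u.dst.c_str(),
                    u.src1.c_str(), u.src2.c_str());
    return 0;
}

The renamed temporaries (tmp0, tmp1 above) stand in for the large virtual register sets mentioned earlier: they are what let the out-of-order core keep many such sequences in flight at once.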
Furthermore, it's a chicken-and-egg problem when it comes to CPUs. A lot of optimization for x86 occurs because of the vast amount of software (Windows, etc.) that runs only on x86. This software is often less than efficient, and the manufacturers (Intel and AMD) optimize for the software inefficiencies with things like branch prediction, dynamic fetching, out-of-order execution, etc. Unfortunately, the optimization units that deal with x86 inefficiencies end up costing nearly as many transistors as the units that actually do the work. Other architectures that are more efficient or ship less volume will get less optimization simply because there isn't a reason to throw more $$$ at these optimization units if the core architecture and Instruction Set (IS) are already efficient.
Video cards are not bound to a particular architecture. You can have a radically different video card programmed with a similar API (DirectX or OpenGL). Perhaps this can be considered similar to the CPU markets, where AMD and Intel have different internal micro-architectures that interpret and execute the same API (of x86 instructions). However, if one architecture is much less efficient than another, it's easier to switch to the more efficient architecture with a well-designed software abstraction layer in between (DirectX/OpenGL) than to do the hardware-level translation (x86 procs). Video cards don't have to worry about software compatibility as long as they can support a minimum number of DirectX/OpenGL features. And it seems like add-on (PCIe/AGP/etc) video cards *ALWAYS* have to worry about performance and price more than CPUs do. There's a market for slower, cheaper CPUs like the Sempron and Celeron, but the only market for cheap video cards is in the MB/integrated category. People aren't going to get excited about an add-on video card that's slow.
Re:My Take (Score:2)
S3 is a good option for HDTV (MythTV) playback. (Score:4, Informative)
Even on powerful systems, decoding and displaying HDTV content can be tough. The current S3 "Unichrome" integrated video processors include MPEG decoding capabilities. This goes well beyond the MPEG acceleration in XvMC / DxVA: it does most of the MPEG processing in hardware, rather than only the iDCT/MC.
Hopefully these new cards will continue to support MPEG decoding. If so, I'll buy one & ditch my Nvidia with their closed source binary drivers.
But, I would need to understand a few issues before taking the plunge:
- Are the specs & source code for the card fully open? (VIA / S3 have had some issues on this front in the past).
- Are these cards available for purchase? The S3 DeltaChrome & GammaChrome cards were not available as far as I could tell. Only the Unichrome was available, as an integrated video option on VIA motherboards.
- Does it have full MPEG2 decoding support?
- Does it have MPEG4 accel support? How about MPEG4.10 / AVC accel (or full decoding)?
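For the XvMC part of those questions, a rough probe is easy to write. This sketch only uses libXvMC's two basic query calls (XvMCQueryExtension / XvMCQueryVersion); a "yes" here just means the X server advertises the extension - whether the driver offloads full slice decode or only iDCT/MC still has to come from its documentation:

// Rough XvMC probe. Build: g++ xvmc_probe.cpp -lXvMC -lXv -lX11
#include <X11/Xlib.h>
#include <X11/extensions/XvMClib.h>
#include <cstdio>

int main() {
    Display* dpy = XOpenDisplay(0);
    if (!dpy) { std::fprintf(stderr, "no X display\n"); return 1; }

    int ev = 0, err = 0;
    if (!XvMCQueryExtension(dpy, &ev, &err)) {
        std::printf("XvMC: not advertised by this server/driver\n");
    } else {
        int major = 0, minor = 0;
        XvMCQueryVersion(dpy, &major, &minor);
        std::printf("XvMC %d.%d present; check the driver docs for how much\n"
                    "of the MPEG pipeline it offloads (iDCT/MC vs. full decode)\n",
                    major, minor);
    }
    XCloseDisplay(dpy);
    return 0;
}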
S3's real market is in integrated chipsets (Score:3, Interesting)
That's probably the future. The plug-in graphics card is rapidly headed for the same fate as the plug-in math coprocessor chip, the plug-in MMU chip, the plug-in DMA controller chip, the plug-in serial port board, the plug-in network adapter, and the plug-in disk controller.
Re:S3 dear god (Score:4, Funny)
AC - meet Mr Period (.) and his friend Mr Comma (,). They make writing fun! They have a cousin, you know - she's called Miss Dictionary. All of these fun people are here to help you be understood. Enjoy them, embrace them and above all use them.
If you don't, you'll give people the impression that you are a dribbling fool who married his sister by mistake.
Re:S3 dear god (Score:5, Funny)
Now, now, let's not be too harsh. I'm sure he married her on purpose.
Re:S3 dear god (Score:2, Funny)
Does this mean that S3 = - Cyrix 20?
Re:S3 dear god (Score:2)
S3 got a bad name, but I actually never quite saw why. Same with Cyrix. Cyrix processors performed very comparably to Intel and AMD, and there really were no major 3D accelerators at the time.
I liked my Cyrix + S3 setup, and will gladly defend it.
Re:S3 dear god (Score:2)
Quite right. I'd put my old Am486 DX4-100 up against any of those Cyrix piles of trash any day. I was able to play games like Quake 2 and Need for Speed 2 - games which supposedly required a Pentium 90 or better. That machine was great...
Hardware Hell (Score:4, Funny)
S3 Virge
VIA KT chipsets
Creative Labs 3DO Blaster
Iomega ZIP
Iomega Buzz
IBM Deskstar
Tandy CDR-1000
HP 5L
Cyrix 386 to 486 CPU Doubler
Anything Belkin
Re:Hardware Hell (Score:2)
2. The Deathstar term appeared before the sale (actually I figure that's why they sold it, because they couldn't make a quality product).
Back in 2000, while IBM still owned the business, I heard from a reseller that they had about a 30% failure rate (in the first year) for the Deathstar series.
Actually I had one myself (1.7GB or something like that) - it died an untimely death.
3. I think most Thinkpads weren't manufactured by IBM anyway and I don't think the new guys wi
Re:S3 dear god (Score:2)
They have great video acceleration and are cheap to buy.
They consistently beat or perform as well as high-end ATI and Nvidia hardware in the video arena.
You can still use them to play slightly older 3D games at decent speed, but you and I both know they don't compare to the mid or high end from the two behemoths of the 3D industry.
If there are open source drivers, then they'd make great GNU/Linux cards.
Re:How Much (Score:3, Insightful)
Unless they're relabeling a Virge, in which case we're all obviously in Hell.