Bitboys Silicon Sighted 148
ZaPhY42 writes: "The Bitboys look like they've actually produced some working silicon of their mythical XBA (Xtreme Bandwidth Architecture) based graphics card, which they were previewing at Assembly 2002. Photos of the card can be found here(1)
and here(2). What next? Duke Nukem Forever gets released by 3DRealms? ;)"
FPGA? (Score:5, Interesting)
Re:FPGA? (Score:2, Informative)
Re:FPGA? (Score:2)
Re:FPGA? (Score:1, Interesting)
The PDA accelerator was actually quite a hot product. It had fewer than 100k gates and worked on a different principle than the usual accelerator cards (vector/polygon based).
Yep (Score:1)
Who knows if it works well or will actually be fast?
Re:Yep (Score:2)
Besides, if you send the design to Altera, they can manufacture a whole bunch of ASICs for you. In theory... whether the resulting ASIC would be fast enough is unclear. (Doubt it.)
hi-res pictures (Score:4, Informative)
Re:hi-res pictures (Score:2, Informative)
Re:hi-res pictures (Score:1)
mirror in progress [pitt.edu]
Video also (Score:1)
It's 108 megs. Not too slow yet.
Interesting... (Score:1, Interesting)
Case in point: the new Parhelia by Matrox has seen some overclocking, and testers have found that the performance gain scales linearly with the core clock gain. It's still a slow card relative to the current FPS kings.
On a side note, I am a Matrox fan and have owned a G200, G400max, and will be getting my Parhelia soon. I don't game as much and can use three monitors. =)
Re:Interesting... (Score:3, Informative)
If you don't know what you are talking about, then don't talk, OK? The chip has 12 megs of on-die eDRAM on a 1024-bit memory bus running at chip speed. That is pretty revolutionary when it comes to PC 3D-accelerators. The PS2 has something similar in its graphics chip.
as for the 3 monitor thing (Score:1)
i am assuming you use 20/21 inch monitors, and 1280 just seems wasteful on a 21 inch. 2x1600x1200 gives the same pixel count, without giving you as much clutter on the desktop (3 monitors).
so if i eventually get enough $$ for the 3 monitors, i will probably get a good card for the primary (so i still *can* game) and then two pci cards (they are still around, somewhat) for the other two, so i can drive the resolution to full on each monitor.
just some thoughts i collected during my search for the perfect desktop.
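for what it's worth, the pixel math works out. a quick check in Python, assuming the three monitors would run 1280x1024 (my assumption):

    print(3 * 1280 * 1024)   # 3,932,160 pixels across three heads
    print(2 * 1600 * 1200)   # 3,840,000 pixels across two heads, ~2% fewer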
Why this is news (Score:1)
Re:Why this is news (Score:3, Informative)
Because they did not announce any products. They held a seminar, and at that seminar they had chips and video cards on display. Now, in the past they have "pre-announced" products. But that was in the past. They haven't said a thing regarding their products in over 1.5 years! They have been running silent for a long time now!
Re:Why this is news (Score:5, Informative)
Back in 1995, or was it 1996, a small Finnish company by the name of Bitboys Oy announced their Pyramid3D chip. As the months rolled by, successive technology reports emerged that detailed such things as a geometry engine (transform and lighting) and projected speeds well in excess of the first Voodoo card from 3Dfx. What was never seen, however, was the actual product itself. Were BitBoys for real, or was it all a hoax? As I understand it, BitBoys approached a company by the name of VLSI Solutions with a chip design that later turned out to be written on the back of an envelope. TriTech Microelectronics then purchased that envelope with the intention of designing the chip themselves. VLSI would then fabricate the board, and another company would program the drivers. Eventually some pre-production models were produced and demonstrated at Assembly '97, but just before the cards went into mass production in 1998, TriTech pulled out of the project, killing it stone dead.
So it would seem that the Pyramid3D project did eventually become legitimate, even if Bitboys never were. The following left-hand image is purported to be a pre-production version of the card, but the right-hand image has been sent in by Mark Vojkovich, who actually owns one today. It has 8MB of SDRAM and a pass-through connection similar to that found on a Voodoo board.
[Figure 4.6: the two Pyramid3D card photos described above]
In May 1998 BitBoys Oy raised their heads once again and announced their Glaze3D chip. This chip had a projected performance four times greater than that of the then all-conquering Voodoo2 chipset! At the time this statement caused a considerable stir, especially considering the fiasco surrounding their previous attempt. However, once everyone realized that the chip would not be produced for over a year, this interest soon dissipated.
[Figure 4.8]
At Siggraph99 in August the Bitboys were back, this time with an updated Glaze3D specification that included every feature under the sun, including 9MB of embedded DRAM memory and four pixel pipelines capable of rendering 600 Mpixels/sec and 1.2 Gtexels/sec. In addition to this, two of the new chips could be connected in parallel to produce a phenomenal 1.2 Gpixels/sec and 2.4 Gtexels/sec. Was it merely a coincidence that this specification seemed to mirror many of the forthcoming features from NVIDIA and 3dfx? Would we ever see a Glaze3D chip? Don't hold your breath.
Can you believe it, in January 2000 the Bitboys were back again. Don't these guys know when to stay down? Having shelved their Glaze3D chip (surprise, surprise), they now announced their new XBA(TM), or Xtreme Bandwidth Architecture! Yeah, right. By this time few were taking the Bitboys seriously, as demonstrated by the following press releases.
www.3dspotlight.com [accelenation.com]
www.somethingawful.com [accelenation.com]
If the Bitboys had ever done more than just PR, this announcement might be worth caring about. You say they did a tech demo? Great. So what?
Re:Why this is news (Score:1)
If Bitboys had anything more to do with it than sketch an envelope, that's pretty impressive.
When BitBoys' XBA ships... (Score:5, Funny)
Guess you get to decide who's to blame for the holdup.
Re:When BitBoys' XBA ships... (Score:2)
But then again, I remember when I was eagerly awaiting Quake - the platformer starring a dwarf.
Re:When BitBoys' XBA ships... (Score:1)
I don't know about you, but I'd still like to see that game. :-)
It would be nice if id would start mixing a little intelligence in with the action.
Re:When BitBoys' XBA ships... (Score:1)
Bandwidth Owner? (Score:4, Funny)
Man, he's probably really pissed now.
NOT XBA! Display accelerator for mobile devices (Score:5, Informative)
The demo they showed was indeed running on an FPGA. It used around 20k-30k gates and ran at around 25 MHz or so. The demonstration animated filled polygons and Bézier curves, with various effects such as transparency, at around 30-50 fps.
Obviously we are not talking about something that would run Doom 3! Having said that, their solution looked very interesting from a mobile point of view, since it could provide acceleration for UI, SVG and simple games at a very low cost in terms of gates and power consumption.
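To give a rough idea of what "filled polygons and Bezier curves" means at the algorithm level, here is a toy software model in Python. It is purely illustrative: the even-odd scanline fill and all the names are my assumptions, nothing from Bitboys' actual design.

    def flatten_quad_bezier(p0, p1, p2, steps=16):
        """Approximate a quadratic Bezier curve with `steps` line segments."""
        pts = []
        for i in range(steps + 1):
            t = i / steps
            x = (1 - t)**2 * p0[0] + 2 * (1 - t) * t * p1[0] + t**2 * p2[0]
            y = (1 - t)**2 * p0[1] + 2 * (1 - t) * t * p1[1] + t**2 * p2[1]
            pts.append((x, y))
        return pts

    def fill_polygon(verts, width, height):
        """Even-odd scanline fill; returns the set of covered (x, y) pixels."""
        pixels = set()
        n = len(verts)
        for y in range(height):
            xs = []
            for i in range(n):
                (x0, y0), (x1, y1) = verts[i], verts[(i + 1) % n]
                # count the edge if it crosses this scanline (half-open test
                # avoids double-counting shared vertices and horizontal edges)
                if (y0 <= y < y1) or (y1 <= y < y0):
                    xs.append(x0 + (y - y0) * (x1 - x0) / (y1 - y0))
            xs.sort()
            for left, right in zip(xs[::2], xs[1::2]):
                for x in range(max(0, int(left)), min(width, int(right) + 1)):
                    pixels.add((x, y))
        return pixels

    # fill the region under an arch-shaped Bezier curve on a 32x32 "screen"
    outline = flatten_quad_bezier((2, 28), (16, -20), (30, 28))
    print(len(fill_polygon(outline, 32, 32)), "pixels covered")

A hardware core would do the same inner loops with fixed-point incremental arithmetic rather than floating point, which is presumably how it stays within a few tens of kgates.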
Re:NOT XBA! Display accelerator for mobile devices (Score:2, Interesting)
They should be concentrating on getting hardware Transform & Lighting running, so as to relieve the relatively slow CPUs on those devices of transforming all of those polygons.
However, some of the photos referenced in the original article definitely show some kind of hardware driving a PDA-resolution LCD, so I guess that truly is their target environment.
Re:NOT XBA! Display accelerator for mobile devices (Score:1)
If you check the photos, it seems that they had a real PC accelerator on display as well. I missed that at the party, since they didn't demo it or talk about it during the presentation. It may well be the XBA thing. I don't know how complete it is, but it might well be working -- heck, I've even seen Pyramid 3D running Tomb Raider.
The hardware driving the display is the mobile accelerator they demoed. I guess it is probably a standard FPGA test board, which is why it has both a PCI connector and USB interface. I don't recall the display type or resolution (if they even mentioned it), but it is probably something in the range of a Nokia 7650 (176x208) or PocketPC (240x320).
Re:NOT XBA! Display accelerator for mobile devices (Score:4, Insightful)
You shouldn't believe everything you're told. The chip was very clearly marked, it's an Altera APEX EP20K400C [altera.com] PLD. The memory chips on the back are Altera EPC SRAM 'configuration devices' [altera.com]. That means it's got between ~400,000 and 1,051,648 [altera.com] gates, not 20-30k.
While I can't fault them for writing a program that does everything you mentioned for this particular PLD, it's definitely not as impressive as they led you to believe, and I don't see how this is 'their' silicon at all. The other card I can't comment on, since it didn't actually do anything. (But we all know that a reputable company like Bitboys is far above faking a demo, right?)
Re:NOT XBA! Display accelerator for mobile devices (Score:4, Informative)
The chip was very clearly marked, it's an Altera APEX EP20K400C ... That means it's got between ~400,000 and 1,051,648 gates, not 20-30k.
Yeah, well, the 20-30k figure was from their presentation (might have been 22k, I don't remember exactly). Nothing forces you to use everything on the chip... Obviously we have no way of checking the claims, but I don't think they have a big reason to give misinformation in that area. The people they need to convince are the mobile device manufacturers, not a bunch of demo coders.
There is no "their" silicon for the mobile accelerator in the usual sense of having an NVidia chip. There probably never will be, either -- something like this would be integrated into existing silicon in a device, not put into a separate chip. If you open a modern mobile phone, you don't find separate CPUs, DSPs, etc.; everything that can be integrated is.
Having said that, I wouldn't hold my breath waiting for the first device with this graphics hardware to ship... The Bitboys don't exactly have a stunning track record in that area. :-)
Re:NOT XBA! Display accelerator for mobile devices (Score:4, Informative)
When counting gates, FPGAs are inherently less efficient than ASICs or full-custom chips, due to the FPGA's fixed structure. A logic design that takes 400,000 gates in an FPGA may fit into 40,000 ASIC gates. This is normal. The fact that Altera calls this device a 400,000-gate device doesn't mean it actually is one. This is a hard-to-measure number, just like performance benchmarks.
FPGAs are usually left at 50%-60% utilization if you want to be able to get any decent speed out of them. If you start filling them, routing becomes harder and the speed drops.
Remember that this is a general purpose prototyping board. They may use a larger device not because they need it but because it allows them more freedom while designing.
Summary: the fact that they use an FPGA that is characterized by its manufacturer as a 400,000-gate device doesn't mean their graphics core won't fit into 22,000 ASIC gates.
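Putting rough numbers on that (the 10:1 gate ratio and 55% utilization are rule-of-thumb assumptions, not figures from Bitboys or Altera):

    fpga_marketing_gates = 400_000   # EP20K400C-class device
    utilization = 0.55               # typical headroom for routing and speed
    fpga_to_asic_ratio = 10          # assumed overhead of the FPGA fabric

    usable = fpga_marketing_gates * utilization      # ~220,000 usable gates
    asic_equivalent = usable / fpga_to_asic_ratio    # ~22,000
    print(f"~{asic_equivalent:,.0f} ASIC-equivalent gates")

Which, for what it's worth, lands right at the 22k figure quoted from their presentation.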
Re:NOT XBA! Display accelerator for mobile devices (Score:2, Informative)
However, given that, it might be their general-purpose proto board. The graphics core might be 40k (or fewer) equivalent gates, but it looks like they access the devices over a USB interface, and that takes more gates. And, if you're not sure how big your design will be, it never hurts to have too much FPGA.
While I'm nit-picking the boards, the other board on display is a BGA in a friction-mount connector. Looks like they're expecting to replace that chip quite a bit.
And, a final on-topic-like statement. I work on an embedded device. Acceleration of video at any cost in power is worthless. I'd rather flash 5 screens per second at very low power than 50 screens per second at multi-Watt power consumption. (Reference ATI's mobile graphics solution: low power, but still not as low as StrongARM integrated video.)
Re:NOT XBA! Display accelerator for mobile devices (Score:1)
Not to point out the blindingly obvious, but it's quite likely that they're not using :::all::: the gates on the EP20K400C. Just because the PLD (PROGRAMMABLE Logic Device) has that many gates doesn't mean they're all being used in the Bitboys design.
-Chris
Overspecced prototype hardware. (Score:2)
At work I'm dealing with a system that has a few FPGAs and DSPs on it, along with some RF hardware. It currently consumes 6-7 amps at 24 volts. It's expected that by throwing away a lot of our "excess" silicon, we'll be dropping that to an ampere or two, simply because EVERYTHING on that board is massive overkill.
Why the numbers? (Score:1)
Even in a clear case, they would be parallel with the floor/desktop?
Re:Why the numbers? (Score:1)
Re:Why the numbers? (Score:1)
Re:Why the numbers? (Score:1)
It could be pretty cool if you had one of those case windows...
Will they succeed? (Score:2, Interesting)
I think "will they succeed?" is a really interesting question, for this company.
I presume they've still got Psi (Sami Tammilehto). He was the Carmack of the demo scene, an innovator in realtime graphics programming, back in the early/mid '90s.
Finland has proven, with Nokia, that it can compete on a global scale with consumer products. But this startup feels a long way behind Nvidia, ATI and the other established players.
Will their chip be good enough to find people to license it? Will the drivers be good enough to compete with Nvidia? What market will they target (hardcore/mainstream/mobile)?
I think this news raises more questions than it answers, but for love of the Finnish demo crews alone, it's worth keeping an eye on them.
look at the site for pics number 2 (Score:1)
Working strategy (Score:2, Funny)
Re:Working strategy (Score:1)
1. Hype!
2. Hype!
3. Hype!
x-1. Hype!
x. Display non-working prototype running off FPGA
x+1. Hype!
y-1. Hype!
y. ???
y+1. Hype!
n-1. Hype!
n. PROFIT!!!
Re:Working strategy (Score:4, Funny)
- Hype!
- Actually produce a product.
- Profit!
This, for the Bitboys, was the turning point.
It's been a long time since I looked at any demos (Score:1)
Anyway, just brought back old memories. Now my chip is up to 59C.:)
J:)
Re:It's been a long time since I looked at any dem (Score:3, Interesting)
Future Crew Timeline [defacto2.net]
And Skaven [futurecrew.com] was even competing. In fact, he won the "Instrumental Music" category with a new version/sequel to his previous winning song "Catch That Goblin".
Anyone interested in MOD/ULT/S3M/IT/XM/669 music from the demo scene should check out Nectarine Radio [scenemusic.net].
What's with Finland? (Score:1)
Re:What's with Finland? (Score:1)
YAWIAR
Re:It's been a long time since I looked at any dem (Score:1)
The site I linked to in my grandparent post also has archives of FC's music. I didn't see any recent demo work from FC - must be too busy with their jobs in the industry...
J:)
BB ... YES! (Score:2)
Or perhaps that Transmeta will couple it to its ultra-low-power, ultra-low-performance line of CPUs?
That little LCD display being driven by the Bitboys GPU is nice...only if we want to run in 120 x 70 display mode.
Cut the donkey-puck, BitBoys. Put out the hardware on production-level silicon. Until then, we can't take promises like "going to tape-out in 1999" seriously.
Maybe in another 3 years the taped-out silicon will reach production. Until then, what? Synthetic benches run on an imaginary system looping an imaginary benchmark under synthetic conditions?
I'm a bit afraid to look under their (Score:3, Funny)
Now hiring:
Marketers - We need dedicated people to hype non-existent products that on paper outperform all the competitors combined!! We will never have an actual product on the market, but we need skilled marketing professionals to make people think that one day we actually will.
Engineering - No Current Openings (and never will be)
The DNF-BitBoys Connection (Score:3, Informative)
When the members of the famous demo group Future Crew (think "Second Reality") finally got full-time jobs, there were a couple of shops they went to. First, some of them went to Remedy, which produced Max Payne for publisher 3D Realms (Skaven [futurecrew.com] did some of MP's music), and 3D Realms of course does work on DNF. At the same time, some of the other guys broke off to work at the Bitboys, as they were really more the hardware type. So who knows? It may very well be possible that both sides are holding things up to release together, all because of where they came from.
Re:The DNF-BitBoys Connection (Score:2, Informative)
Future Crew and Sorcerers were competing against each other. Read the scrolltexts of Future Crew's first demo "YO!" (coded by PSI) - they send greetings to "Sorcerers".
Yahaya (Score:3, Insightful)
The demo they showed was indeed running on an FPGA. It used around 20k-30k gates and ran at around 25 MHz or so. The demonstration animated filled polygons and Bézier curves, with various effects such as transparency, at around 30-50 fps.
Yeah, but the demo unit they showed was the relative size of a tank to a Yugo... they want to put THIS into a MOBILE device? Mobile devices come with an ISA slot? Ya, ya, I see how it's all for test and NOT production and all that, but you'd think that BitBoys would have shown something smaller for the mobile market than something you could barely fit into a standard ATX case!
Re:Yahaya (Score:2, Informative)
Yeah, but the demo unit they showed was the relative size of a tank to a Yugo... they want to put THIS into a MOBILE device?
You don't put new chips into mobile phones. The accelerator would be integrated into the same silicon as the CPU or the display controller (which might be on the same chip anyway). From that perspective it makes sense to develop and demo it on an FPGA, since it would be licensed as an IP block anyhow.
Huh? (Score:1)
Re:Huh? (Score:2)
Re:Huh? (Score:2)
In fact, I remember reading an old MaximumPC article stating that the Bitboys' Voodoo2 killer was doing a demo at 25 fps while the Voodoo2 was doing it at 45 fps. It's no longer online so I can't link it, since this was way back in 1997. After the hype, their demo never performed and the product was never produced.
I take their claims with a grain of salt. I'm glad I'm not an investor, since they have never made a single sale in 6 years! My guess is that either 3dfx or Nvidia kept coming out with superior technology, and they decided not to ship and start over with yet another design.
I am very skeptical that this chip is the killer chip. Mark my words: if the GeForce5 is better, then you can kiss this vaporware goodbye. I am surprised only one
Holy Cow! (Score:2, Funny)
Imagine the number of frames-per-second of ultra-low rez polygons that card can deliver!
Okay, I need some more sleep.
dreamer out...
I was there (Score:5, Interesting)
After the seminar I (and others) managed to talk with them. They had their PC 3D-accelerator on display, along with sample chips. They are pulling out of the PC business for now in order to focus on the mobile stuff. The chip is called "Axe" and it is working. They are testing it in-house as we speak, and a new revision of the chip is coming up. But it will not reach consumers, because Infineon is killing the silicon process at the end of the year. The chip had 12 megs of eDRAM and it was somewhat bigger than other chips out there.
You can get the seminar from:
ftp://ftp.asmparty.net/pub/seminars/
It's the one called "Graphics hardware for handheld devices". I'm the guy with the laptop.
Correction (was: Re:I was there) (Score:3, Informative)
Obviously, that should be SVG (as in Scalable Vector Graphics) acceleration.
Still vapour (Score:1)
About Duke... (Score:1)
Apparently it's almost done... or a scam
Re: (Score:1)
Re: (Score:1)
BS Announcement (Score:2)
Somewhere there are some really stupid venture capitalists funding these guys.
Re:BS Announcement (Score:3, Informative)
You make it sound so simple. But it's not. What BB is doing is not "just widening the memory bus". They actually move 12 megs of the RAM ON THE DIE ITSELF. And that memory is on a 1024-bit memory bus. For comparison, that's four times as wide as on the Radeon 9700 and Matrox Parhelia. That embedded RAM is used for the things that require the most bandwidth, namely the frame buffer. Textures don't need a lot of bandwidth, and they are located in the slower "traditional" RAM. Of course, if there's any eDRAM left, the most-used textures are stored there.
When it comes to PC 3D-accelerators, that IS pretty damn revolutionary!
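The arithmetic behind the headline number is simple enough to sketch. The bus width is from the post above; the core clock is my assumption, since nothing official was given:

    bus_bits = 1024                  # on-die eDRAM bus width, per the post
    clock_hz = 150e6                 # assumed core clock, not confirmed
    peak_bytes_per_sec = bus_bits / 8 * clock_hz
    print(f"{peak_bytes_per_sec / 1e9:.1f} GB/s")   # 19.2 GB/s

At around 150 MHz that works out to roughly the 20 GB/sec figure mentioned elsewhere in this thread.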
Re:BS Announcement - Execution and Elegance. (Score:1)
It's one thing to propose embedded memory in a paper design, and another thing entirely to get this working on silicon that sells. PS2 did this, and Sony deserves much credit for delivering product. Execution is what matters in the graphics business. nVIDIA understands this. Architecturally or academically, what also matters, given the ability to execute, is elegance. Tile-based rendering is an elegant idea, but it's a pig to execute. Elegance is nothing without execution, and brute force isn't even elegant.
Embedded video memory is already SHIPPING (Score:2)
It's one thing to propose embedded memory in a paper design, and another thing entirely to get this working on silicon that sells.
A GPU with an on-die frame buffer isn't just vapor on paper. There's one in a video game console from Nintendo called the GameCube. PCs with the GameCube hardware, called Dolphin development kits, are available to a select few.
Re:Embedded video memory is already SHIPPING (Score:1)
sorry for the misunderstanding (Score:1)
You're not trying to impress me with your developer credentials are you?
No, just pointing out an additional example. If both GCN and PS2 do it, and they manage to make good graphics on a budget (a PS2 chipset + a joystick + a DVD-ROM drive + a DVD decoder license < $200), it's only a matter of time before the tech comes to the PC. Expect good things from ATI in the near future.
I'm not even a licensed developer; I'm just a lowly homebrew hacker. Here's what I've done on the GBA [pineight.com].
Re:BS Announcement (Score:1)
embedded RAM is used for the things that require the most bandwidth, namely the frame buffer. Textures don't need a lot of bandwidth, and they are located in the slower "traditional" RAM.
Oh dear. You haven't thought that through, have you? How many texels contribute to a pixel per texture map? How many textures per polygon?
Re:BS Announcement (Score:2)
Answer: The frame-buffer.
Re:BS Announcement (Score:1)
You haven't thought it through yet. Why don't you _quantify_ your assumptions? How many textures per polygon? Don't forget those fancy DX8 pixel shaders while you're at it. How many texels are read to generate one textured pixel? Is this a classic Z-buffered architecture or tile-based? Any tricks like Hyper-Z accounted for? Where are your caches for the textures in off-chip RAM? If you haven't got the message yet: don't make sweeping generalisations about graphics architectures. Sigh.
Re:BS Announcement (Score:2)
Sure, there are all kinds of tricks to reduce the bandwidth eaten by the frame buffer (compressed Z, etc.), but you can do the same with textures (compressed textures, anyone?)
And still, even if textures as a whole required more bandwidth than the frame buffer, it still wouldn't mean that the most-used 12 megs of textures (that much would fit into the eDRAM) required as much bandwidth as the frame buffer does. So putting the frame buffer into the eDRAM is the smart thing to do, IMO.
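Since quantification keeps being demanded, here is a crude per-pixel traffic model. Every number in it is an illustrative assumption (classic Z-buffered pipeline, dual texturing, bilinear filtering, a texture cache that catches ~3 of every 4 texel fetches), so treat it as a sketch of the argument, not as data:

    framebuffer_bytes = 4 + 4 + 4    # Z read + Z write + color write, 32-bit each

    textures_per_pixel = 2           # dual texturing
    texels_per_sample = 4            # bilinear footprint
    bytes_per_texel = 4              # 32-bit texels
    cache_miss_rate = 0.25           # assumed: neighbors reuse ~3 of 4 texels

    texture_bytes = (textures_per_pixel * texels_per_sample
                     * bytes_per_texel * cache_miss_rate)
    print(framebuffer_bytes, texture_bytes)   # 12 B/pixel vs 8.0 B/pixel

Under these assumptions the frame buffer still edges out textures per pixel, but the result is very sensitive to the assumed cache miss rate, which is rather the point of the whole argument.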
Re:BS Announcement (Score:1)
Re:BS Announcement (Score:2)
But I'm sure that you know more about this than people with vast experience regarding 3D (both hardware and software). I mean, BB is using the eDRAM primarily for the frame buffer, not for textures. But like I said, I'm sure you know more about this than they do....
From BB website:
As an example, how much memory bandwidth is required if the 3D-graphics chip renders 600 million pixels/sec and 1.2 Gigatexels/sec using a dual-texturing pipeline? Assume 32-bit color and 32-bit floating-point Z, as both are superior to their 16-bit counterparts.
For each rendered pixel (on average) we read the depth value and write the color and depth value back to the frame buffer. This means that for each pixel we must access 12 bytes of memory. 600M by 12 is 7.2 GB/s. But this is not all; we also have to count the bandwidth required by the video refresh unit: at 1024x768x85 Hz that's 64 MB/s. We also need to read textures, and that's 500 MB/s to 2 GB/s. In total, close to 10 GB/s of memory bandwidth, and that's just for 600 Mpixels / 1.2 Gigatexels.
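Redoing the quoted arithmetic in Python as a sanity check (the observation about the refresh figure is mine, not BB's):

    pixels_per_sec = 600e6
    fb_bytes_per_pixel = 4 + 4 + 4               # Z read + Z write + color write
    fb_bw = pixels_per_sec * fb_bytes_per_pixel
    print(f"frame buffer: {fb_bw / 1e9:.1f} GB/s")    # 7.2 GB/s, as quoted

    refresh_bw = 1024 * 768 * 85 * 4             # 32-bit color refresh
    print(f"refresh: {refresh_bw / 1e6:.0f} MB/s")
    # ~267 MB/s; the quoted "64 MB/s" looks like megapixels/s, not megabytes/s

    texture_bw = 2e9                             # top of the quoted 0.5-2 GB/s range
    total = fb_bw + refresh_bw + texture_bw
    print(f"total: ~{total / 1e9:.1f} GB/s")     # ~9.5 GB/s, near the quoted 10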
Re:BS Announcement (Score:1)
There isn't -proof- one way or the other, there are only assumptions, specific design decisions and specific implementations. Don't forget in many cases, the closest thing to proof available usually means breaking an NDA and I'm not falling for that trick. Of course I don't know more than people with vast experience, but when you've been doing 3D hardware and software since 1985 like I have (on and off with about 50% duty cycle) you're allowed to argue a few points on Slashdot. I promise not to reply if you want the last word.
Re:Innovation?!! (Score:1)
Re:BS Announcement (Score:2)
ATI did that for the chip inside the Nintendo GameCube, so that isn't all that impressive. Considering that the top-end Nvidia and ATI GPUs are some of the most complex silicon on the planet, this wouldn't be that hard for another manufacturer to do. 1024-bit buses have been used in supercomputing circles before; this isn't an accomplishment like tile-based rendering was.
Re:BS Announcement (Score:2)
Doesn't matter if it's a PC. See Yoshi's Boxx. (Score:1)
Last time I checked, GameCube is no PC.
No, but Dolphin is. Dolphin is the workstation that GameCube developers use.
What's the difference between a PC with a video card and a PC with a console on a card [techtv.com]?
Release DNF? Ha! (Score:2)
Better card from ATI (Score:1, Funny)
BitGirls (Score:2)
Keep browsing if the first one you try is a bit cheesy. Some of the computer work is quite impressive. [griots.co.jp]
Just think, in a couple of years the BitBoys may be able to render the BitGirls in realtime
-
20 GB/sec? (Score:2)
Re:20 GB/sec? (Score:1)
"Late" is not strong enough a word. (Score:2)
This is a whole different realm of late, approachable only by the likes of HURD and possibly Duke Nukem as mentioned earlier.
Call me when they have running silicon (Score:1)
Even if they did ship XBA.... (Score:1)
Batboy! (Score:1)
Of course, when I read this headline I immediately thought of our pint-sized West Virginian friend [weeklyworldnews.com].
Duke Nukem ForNever (Score:2)
People kind of chuckle about Duke Nukem Forever, but I mean, think about it: surely Duke Nukem Forever is the worst-case scenario in Software Project Management 101.
32bits colors, bahhh (Score:2)
Anyway, the point is, they are talking about 20 GB/sec bandwidth, comparing themselves to a Radeon 8500. They aren't shipping yet; the Radeon 9700 is shipping, has about the same specs, has brand recognition, and has more bit depth; Matrox has more features and bit depth; Nvidia will probably ship theirs before Bitboys even start sampling... and they will support HDRI as well.
So what's the point? They got a proof of concept running on an Altera FPGA; good for them. Any new technology is welcome and I usually appreciate it, but in their case they made so much vapor in the last few years that they've lost all respect and credibility with the few of us still interested in their stories. If they demo something extraordinary, I'll be impressed. I'd say evolutionary is a better expectation.
Re:Women & OSS: The Frightening Similarities (Score:1, Informative)
Oh, waitaminute... ;)
Re:Women & OSS: The Frightening Similarities (Score:1)
Re:This is a good time to pimp DemoDVD (Score:1)
Re:This is a good time to pimp DemoDVD (Score:1)
Re:this is last weeks news, and obviously BS (Score:2)
I remember when people talked about NVIDIA like that.... "Who cares about NVIDIA? 3Dfx will own them!"
Re:this is last weeks news, and obviously BS (Score:1)
Re:always recycle ... to the extreme! (Score:1)
I suppose next you'll tell me that the XFL isn't coming back next season.
Oh, wait... oops.