Nvidia Talks About Next-Gen GeForce, Plus Pics
Per Hansson writes "Techspot was at Comdex in Sweden a few days ago; we have now posted a small interview with Nvidia, along with some high-res pictures of the GeForce FX, on this page in our new comments system." This is one of the strangest-looking video cards I've ever seen (and it isn't cheap), though it may look different by the time you can buy it in a box. Which is not yet, despite all the hype.
For who? (Score:4, Interesting)
Re:For who? (Score:2, Interesting)
Re:For who? (Score:5, Insightful)
The first reason is that the first PCI slot tends to conflict with the AGP slot in terms of resource management. This may no longer be a problem, but old habits die hard.
The second reason is that the damn heatsink and fan are on the bottom of the card. I'll never figure this one out: why did the hardware engineers do this? The heat from the heatsink rises back into the card and makes the ambient temp even hotter. Most people leave PCI 1 open to help dissipate this heat.
A third reason is that most people are not going to fill their slots anyway. Good mobos today have good sound, a 10/100 NIC, and USB2 onboard. Add a good video card, and the rest of your slots are pretty much empty. Even if you add another card, just follow the urinal code: never place two cards too close for comfort.
In short, the two-card rule has been the de facto standard for years now, so why shouldn't Nvidia embrace it for their own purposes?
Re:For who? (Score:3, Interesting)
Hot air rises. Heat radiates outward.
I.e., the efficiency of a heatsink is not altered by its orientation.
"But the hot air gets stuck under the card!"
Unless the temperature of the air contained within your case varies significantly (which it doesn't with a normal case with a couple of fans sucking air through it), orientation of the heatsink/fan does not matter. Your case doesn't have a mini atmosphere inside of it with updrafts and downdrafts.
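A rough back-of-the-envelope check supports this: with a couple of ordinary case fans, the air passing through the case only warms up a few degrees, so the heatsink sees roughly the same ambient air whichever way it faces. The figures below (150 W of heat, two ~30 CFM fans) are assumptions for illustration only:

```python
# Rough sanity check of the "case air is basically one temperature" claim,
# assuming (hypothetically) ~150 W dumped into the case and two ordinary
# 80 mm fans moving ~30 CFM each. Real numbers vary; this is only a ballpark.

CFM_TO_M3S = 0.000471947           # 1 cubic foot per minute in m^3/s
airflow_m3s = 2 * 30 * CFM_TO_M3S  # two fans at ~30 CFM each
rho_air = 1.2                      # kg/m^3 at room temperature
cp_air = 1005.0                    # J/(kg*K), specific heat of air

mass_flow = airflow_m3s * rho_air  # kg/s of air pushed through the case
heat_watts = 150.0                 # CPU + GPU + drives, roughly

delta_t = heat_watts / (mass_flow * cp_air)
print(f"steady-state case air rise: ~{delta_t:.1f} K above room temperature")
```

That works out to only about 4-5 K of warming for the whole airstream, which is why the orientation of one fan/heatsink inside that stream matters so little.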
Re:Using 2 Slots (Score:2)
Re:Using 2 Slots (Score:2)
First, I've never had a PCI video card complain/care about which slot it's in (at least not in the last 7 or 8 years). Second, I'm sure the FX supports dual outputs like the ti4600 I'm currently using on two 17" monitors does (that's the one thing that sold me; they don't readily advertise this, but it'll drive two at once easily, both accelerated, and either one can run your full-screen games).
Connecting two analog monitors simply involves getting an adaptor, and the digital output becomes a second analog one. Takes some trickery in the driver to get it to function "properly" (so Windows sees it as two separate cards -- the only option that I can stand), but I've had no problems since ditching my two Voodoo3's for the ti4600.
Anyway, I don't like the fan setup myself either. But where I used to care about precious PCI slots, I don't anymore. Between on-board components, dual-output AGP cards, and USB-this and USB-that, I have some 4 PCI slots available in my main PC... so I'm willing to live with the two-slot deal.
Re:Using 2 Slots (Score:4, Informative)
Starting at $150 you can get NVIDIA GeForce cards that natively support dual monitors, even if they only have a DVI output on the back. You just need an adapter to go from DVI to analog.
I am running a GeForce4 Ti 4600 right now with dual monitors at 1600x1200; works great. Before that I was using an Xtasy GeForce4 MX that had two analog ports, and it worked great as well. Get one of those cards, plug both your monitors into it, and you won't regret it. As a bonus, keep your PCI card and you can plug a third monitor in. I have a friend who's doing that today. He seriously has three monitors hooked up that way.
Re:Using 2 Slots (Score:2)
This is not true. Usually the first PCI slot is sharing resources with the AGP slot -- that's why the lost slot doesn't matter much.
Technically speaking, you can put a PCI graphics card into any PCI slot; they are all the same.
(You just have to watch out for shared IRQs, etc. -- but that's true for the first slot too.)
Clearly a first-gen sample (Score:5, Interesting)
Still, I want one. Now.
Re:Clearly a first-gen sample (Score:5, Insightful)
Re:Clearly a first-gen sample (Score:3, Insightful)
Re:Clearly a first-gen sample (Score:4, Insightful)
Re:Clearly a first-gen sample (Score:2)
That would certainly look funky. Then again, I still think of PCI/AGP cards as "upside-down"...
I believe that extending the size to that side of the card would be considered "out of spec", and some motherboards would have a problem with that. My Aptiva board for example has the CPU clip/thing (Slot-1) very close to the AGP card, so in that box at least this wouldn't work.
Re:Clearly a first-gen sample (Score:3, Insightful)
Considering that even good motherboards barely break the $150 mark, while high-end GPUs can be $400+, it doesn't make much sense to make the GPU fit the mobo, when you can find a mobo to work with your GPU of choice.
Re:Clearly a first-gen sample (Score:3, Insightful)
First off, the reason it eats 2 slots is that the 2nd slot is used for the blower. If you invert everything, exactly where are you going to vent the blower? There's no standardized hole available for this kind of thing.
Second, it would render it incompatible with most motherboards. You'd hit either an I/O header, the CPU slot, or (most likely) support electronics like capacitors and the like. There is generally not a great deal of space between the AGP slot and anything above it, because there are minimal (if any) specs requiring distance. A small number of motherboards have problems with high-end graphics cards right now because of heatsinks on the back of the cards -- they usually end up hitting caps, which is the last thing you want to do (ever shorted a cap? Not good).
Good Old Video Card (Score:5, Funny)
I wonder if we're ever going to get to a point where "this is the hardware. You have 10 years to do something cool with it" instead of "oh, look, your program is obsolete again! Your graphics are dated! Another 10 man-years down the drain! Place your bets... (spin)"
sigh...
Re:Good Old Video Card (Score:5, Funny)
I'll put $40 trillion on "The Law of Accelerating Returns" [kurzweilai.net], and laugh at you for putting your money on "Moores Law Has To Hit A Wall Dammit!!!!1!!!1" :-)
No (Score:4, Insightful)
Only when, if ever, we can render something like the Final Fantasy movie in real-time. Something tells me Moore's "law" will have broken down before that though.
Kjella
Re:No (Score:2)
Re:No (Score:3, Interesting)
What I want is for the hardware to support a realistic and comprehensive physics model in said Final Fantasy universe.
Re:No (Score:3, Insightful)
Moore's law will not have hit a wall by then, and I think you will be able to do your Final Fantasy and Shrek rendering by then... but there will be another couple of all-CGI movies about a year before that which will elicit the same post as yours, and it will be answered the same way: wait 5-10 years, it'll happen.
Re:No (Score:2)
Right now, the average machine is running about 128-256MB of RAM (about half of the P4 machines are specced with PC100 or PC133 SDRAM, and half with PC2100 DDR RAM, which actually runs at a 133 MHz clock -- 266 MT/s effective). I personally run my desktop machines with 512MB of PC2100 RAM, and find it works fine.
And yes, I should've made my estimate for the future something more like 2GB RAM.
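For what it's worth, the "PC2100" name is just the peak bandwidth rounded off; here's a quick sanity check, assuming the standard 64-bit single-channel DDR bus:

```python
# Quick check of where the "PC2100" name comes from. DDR transfers twice per
# clock on a 64-bit (8-byte) memory bus; PC2100 parts run a 133 MHz clock.

clock_mhz = 133            # actual clock for PC2100 (DDR266) memory
transfers_per_clock = 2    # "double data rate"
bus_bytes = 8              # 64-bit single-channel bus, assumed here

effective_mt_s = clock_mhz * transfers_per_clock   # 266 MT/s
bandwidth_mb_s = effective_mt_s * bus_bytes        # ~2128 MB/s -> "PC2100"

print(f"{effective_mt_s} MT/s effective, ~{bandwidth_mb_s} MB/s peak bandwidth")
```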
We're getting pretty close (Score:3, Interesting)
Re:Good Old Video Card (Score:5, Insightful)
There's very little reason someone with a video card made a year or two ago would need one of these. My Radeon 8000 works fine, thanks. $400 for a 10-frames-per-second improvement isn't what I call revolutionary progress.
Re:Good Old Video Card (Score:2, Insightful)
A common-sense view of the situation would be: yes, you have a Radeon 8000, so you shouldn't even consider a GFFX. The GFFX SHOULD be marketed at people who have Nvidia TNT2s and 3dfx boards: people who are getting to the point where they want to upgrade have an extra option; people who don't need to upgrade shouldn't.
Common sense. It's pretty easy.
Re:Good Old Video Card (Score:3, Funny)
There's very little reason someone with a video card made a year or two ago would need one of these.
Two words: Quake III.
You are right that 98% of games will run on hardware two years old. However, there is a subset of games that demands the latest and greatest hardware to experience the game. There's no "conspiracy" here, just that certain developers aim at the leading edge. If you don't want to play those games, there's no reason to upgrade.
Personally, the day Quake III comes out is the day I upgrade my video card. :)
Re:Good Old Video Card (Score:5, Funny)
You said it, man. When Quake III comes out I'm gonna PARTY LIKE IT'S 1999!
Re:Good Old Video Card (Score:3, Funny)
Oops... I suppose that's probably a Freudian slip about the "differences" between Quake and Doom. :)
That's Quake IV or DOOM III (Score:2, Funny)
Personally, the day Quake III comes out is the day I upgrade my video card.
Well, you're a couple years late, better hurry!
Re:Good Old Video Card (Score:2)
Re:Good Old Video Card (Score:2, Insightful)
Re:Good Old Video Card (Score:4, Insightful)
Re:Good Old Video Card (Score:2)
Creator3D & Elite3D (Score:3, Informative)
A bunch of their equipment is designed for a 10-year obsolescence cycle. Costs a hefty penny, though. Designed for business and major research universities.
At the University, we were using Creator3D graphics cards from Sun Microsystems. That was in 1999, and the general consumer market still hasn't caught up with that tech. Me, I'm still looking around for auto-stereoscopic monitors. Sharp is coming out with a consumer model next year, I hear.
Re:Good Old Video Card (Score:2)
how do you describe a fly? (Score:3, Interesting)
Or a bunch of flowers.
Is each individual pollen grain to be described?
Will the water eventually splash?
Re:how do you describe a fly? (Score:2)
try jdoom [doomsdayhq.com]
In all honesty you obviously know absolutely nothing about the subject.
In the original Doom the mobs were sprites. They are not 3D entities. The maps were lit and rendered ahead of time in an extremely processor-intensive way so that when the time came to display them the CPU wasn't bogged down in light mapping.
Why would someone in 1993 decide to include bump mapping and environment mapping and real-time lighting and smoke data in a computer game when none of it would be visible for 10 YEARS?!
And you are wrong, the industry did evolve in that direction -- well, two directions actually. OpenGL and DirectX are systems where the scene is described at a higher level, precisely for the reasons you mention (although usually it's for the requirement to degrade gracefully rather than to extend shelf life).
I'm glad that Doom & Heretic et al. look outdated, because I want the next crop of cutting-edge stuff.
You can bet that no one will be including smell & touch data in their games to extend the shelf life of these products when such technologies exist.
3DFX-like Production Problems? (Score:5, Informative)
Only about as worried as if Intel reported probs.. (Score:4, Interesting)
Disclaimer: I have no idea about the economic status of Nvidia. But I do see them in pretty much every computer advertised, and they've generally delivered very successful products since the first GeForce chip, so I assume they have a strong financial position. And if you can't solve it even when you have more money to throw at it than the rest, well, maybe you deserve to be dethroned. That's what competition is all about, isn't it?
Kjella
Ugly little bugger (Score:2, Interesting)
there is this little problem, see (Score:3, Interesting)
it's called silicon real-estate.
it's also called packaging cost.
it's called data routing on the board (FR4 is very, very slow unless you use a LOT of traces, which is very, very difficult).
I think it may also be called lower MTBF.
and how about "debugging is a pain?"
either way, though -- don't expect "multi-processing" on anything but the most high-end incarnations, and only once they have squeezed every last bit of performance out of each chip.
Still no dual-DVI! (Score:5, Interesting)
I wish they'd start putting dual DVI outputs on them. Maybe one of the other companies that makes them (MSI, PNY, Leadtek, etc.) will offer one finally. AFAIK they don't even offer a hydra-head adapter to split the one DVI port into two (I doubt it's possible without a proprietary output like the Radeon VE's).
Re:Still no dual-DVI! (Score:2, Interesting)
Small market maybe? (Score:2)
Ever stop to think how few people would have two LCDs that use DVI and want to use them in a dual-head configuration instead of on two separate machines? It is highly likely that they wouldn't even be able to recoup the costs of a single run of such cards.
Re:Still no dual-DVI! (Score:5, Informative)
Re:Still no dual-DVI! (Score:2)
Re:Still no dual-DVI! (Score:2)
wow! (Score:2, Funny)
Is it Just Me? (Score:4, Insightful)
Nothing new here (Score:4, Informative)
Pentium : 3 million transistors
Pentium Pro : 5.5 million transistors
Pentium 2 : 7.5 million transistors
Nvidia TNT2 : 9 million transistors
Alpha 21164 : 9.3 million (1994)
Alpha 21264 : 15.2 million (1998)
Geforce 256 : 23 million transistors
Pentium 3 : 28 million transistors
Pentium 4 : 42 million transistors
P4 Northwood : 55 million transistors
GeForce 3 : 57 million transistors
GeForce 4 : 63 million transistors
Radeon 9700 : 110 million transistors
GeForce FX : 125 million transistors
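A rough Moore's-law check using two of the GPU entries above (the release dates are approximations I've assumed; they are not part of the list):

```python
# Ballpark doubling-time estimate from two of the GPU data points above
# (GeForce 256, ~Oct 1999, 23M transistors -> GeForce FX, ~Nov 2002, 125M).
# Dates are approximate; this is only a back-of-the-envelope check.

import math

t0_year, t0_count = 1999.75, 23e6    # GeForce 256, assumed ~Oct 1999
t1_year, t1_count = 2002.9, 125e6    # GeForce FX announcement, ~Nov 2002

doublings = math.log2(t1_count / t0_count)
years = t1_year - t0_year
print(f"{doublings:.2f} doublings in {years:.1f} years "
      f"-> one doubling every {12 * years / doublings:.0f} months")
```

That comes out to a doubling roughly every 15 months -- right in line with the CPU numbers above, which is the parent's point: nothing new here.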
The only interesting thing to me... (Score:5, Interesting)
Historically, haven't onboard TV-outs/ins on video cards been kinda crummy? With the exception of the All-in-Wonders, I thought they were scoffed at by the hardcore PC-on-TV users. Does anyone have any more specs on the TV-out chip? Seeing as I'll swim in rusty nails before I spend $650 on a video card, I'm hoping that a watered down version will be available with the same TV-out... anyone?
Speeds... (Score:2, Interesting)
I mean, c'mon... look at even the smallest heatsinks that go into rackmount systems... they're still going to take up some space as far as a video card goes.
Also... I wonder what ATI has up their sleeve. They have had a lot of time to get stuff together for what could be their next-gen chip...
These prices are just getting absurd, but as long as people pay them, prices are going to continue their upward trend...
Bah! Humbug! (Score:5, Funny)
Re:Bah! Humbug! (Score:2)
Slots aren't as valuable as they used to be... (Score:5, Interesting)
Non-slashdotted pictures at Toms hardware (Score:5, Informative)
http://www6.tomshardware.com/graphic/20021118/geforcefx-03.html [tomshardware.com]
Too little, too late... (Score:5, Insightful)
ATI will simply respond with the R350, which is likely going to be an improved R300 core, with DDR2 and a smaller manufacturing process.
It would be unfeasible for nVidia to respond until the summer with the NV31/34, at which time ATI will announce the R400.
I will have to give nVidia one thing, though: their drivers are excellent. This is perhaps the only thing they have going for them at the moment. However, ATI is pumping out a new driver set almost every month, and at this rate they will soon reach parity with nVidia.
Having a taste of M$ specs (Score:2)
Yuhoo...
Think Water cooling (Score:5, Insightful)
People call liquid cooling dangerous, unnecessary, and extravagant, and then buy video cards with cooling like this one, enormous CPU coolers, and half a dozen case fans to try to keep the temperature down.
This will be what breaks NVIDIA, just like 3DFX (Score:5, Interesting)
3DFX used to compete with NVIDIA. When NVIDIA released a new line of cards, so did 3DFX, or when 3DFX released a new line of cards first, so did NVIDIA.
When the GeForce2 cards came out, everyone waited for 3DFX to release their competitive line. About 4 months later, 3DFX released a couple of Voodoo4 cards, but not much in the way of competition, and nothing spectacularly advanced over the Voodoo3s. However, they also let out news of plans to make a market-breaker card, the Voodoo5 6000, which would take up the full case length (and bump hard drives), have 5 fans on it, and require an external wall-wart-style DC adapter for power. It was a $600 card meant for the mega-gamers and graphic designers out there. This was a huge card... and their biggest flop, because once it came out, NVIDIA was already releasing the GeForce3s, which had better specs and lower prices overall.
Now, Nvidia does something just like that. This card is double-height (the second slot's worth is ducting for external air intake and exhaust) and is full case length. It's got monster specs, and has thrown off their regular 18-month cycle of new cards. This new one is $600 as well.
Sounds to me like some of the execs of 3DFX have gotten on the board of NVIDIA via the buyout, and are trying to make another Voodoo5-6000. I hope it doesn't end the same way, with this company going down the tubes as well.
Re:This will be what breaks NVIDIA, just like 3DFX (Score:2)
This quote is revealing: "[Nv]: Well, now that TSMC has their production running at ..."
How nice of nVidia to pave the way for their competition! ATI's gonna save millions.
Significant amount of time to switch. Er, yah. Right.
Re:This will be what breaks NVIDIA, just like 3DFX (Score:2)
Re:This will be what breaks NVIDIA, just like 3DFX (Score:4, Informative)
Re:This will be what breaks NVIDIA, just like 3DFX (Score:2)
Re:This will be what breaks NVIDIA, just like 3DFX (Score:3, Insightful)
You can't just take a current chip design, shrink it from the 180nm process to the 130nm process, and expect it to run. If it did, it would be a miracle of a cosmic sort. As far as changing processes goes, it's somewhat like taking an SUV, pulling out the engine, and putting in an electric motor -- and expecting everything to work fine, except 'faster' or 'better'. Ain't gonna happen.
Most chips are written in an HDL (Hardware Description Language); ATI and nVIDIA use, among others, Verilog and VHDL. Both of these languages have their behavioral-level code, which is somewhat reminiscent of a traditional C program. (Make no mistake, HDLs are a totally different ballgame from programming languages.) Then, after you have the behavioral code working (meets timings, etc.), you synthesize (compile) it.
Here's where it gets tricky:
Synthesis involves taking your process (fab size, power, material, and other characteristics) and creating an optimized layout of gates to perform the tasks described by the behavioral code. The synthesized design almost definitely does not behave exactly like the behavioral code -- but it is close enough, just barely, to meet the critical timings, and the whole thing works.
Quite often, the synthesized code will utterly fail, and the offending part will have to be identified, diagnosed, and fixed. But the fix will probably break something else. It's like putting carpet in your bedroom, and suddenly the ceiling caves in. Fix the ceiling, and the walls turn pink. Repaint the walls, and the bed becomes sentient.
The thing to remember is you get used to the 'personality' of a given fab process, and begin to pre-emptively put in fixes to avoid seeing them at all. But the instant you change fab processes, the entire 'personality' of the synthesis changes, and all bets are off. The entire design will have to be re-synthesized, re-simulated, and re-debugged. And that's before it hits silicon.
Re:This will be what breaks NVIDIA, just like 3DFX (Score:5, Insightful)
Volume.
ATi doesn't ship lots of chips to be sold to OEMs on the cheap. nVidia does, and still will. This was 3dfx's problem, and this will be what keeps nVidia alive. Whether it'll keep them competitive or have them go the way of Trident is another story.
I'm of 2 minds... (Score:2)
On one hand, it's a powerful piece of hardware if any of the hype we're getting fed is remotely accurate.
On the other hand, is it really a good idea to completely reinvent the wheel? Have we really pushed the computing power available to us in the old methods of rendering things in 3 dimensions?
Sneaky... (Score:4, Interesting)
I think I'll stick with my Radeon. If the fan quits, I'll just replenish the oil.
Kudos to Nvidia, though, for finding a way to force their users to buy new cards in the future! This'll certainly be the wave of the future, like fibreglass bodies on cars!
Hmmm... (Score:3, Funny)
Reminds me of a modded Voodoo 3000 (Score:2)
Maybe when they move to a smaller interconnect size (what are they using in this card? 0.13 micron?) it'll run cooler. Then I'll buy.
Tweaking has gone mainstream (Score:2)
Factor in the power requirements of the Athlon/P4 processor, and this is getting ridiculous.
I would love to see some of the laptop power/speed control features in desktop systems. For example, run the CPU at 800MHz when I'm browsing the web, and have it go to 2.4GHz when I need the power (with the cooling fans adjusting accordingly). Of course, a video card with passive cooling is also a requirement for me.
Faster is slower (Score:4, Interesting)
The chips are very slow to switch from text to graphics and vice versa.
I had a board with a slightly older Nvidia chipset. I wasn't very satisfied with the stability of the XFree86 drivers for it, so I tried Nvidia's own Linux drivers. Their driver took five minutes to switch between text and graphics modes.
Older chipsets were much more practical for day to day use; the super speed models remind me of trying to drive a AA fuel dragster to the office every day.
Re:Faster is slower (Score:2)
Re:Faster is slower (Score:2)
Yes, I am aware that the Nvidia written driver for Linux was the cause of the ridiculously long switch time.
Early in the history of accelerated video cards, it was pointed out that the faster they got at graphics, the slower they were in text mode. The very fast processors we have today mask that particular problem.
WHY WHY WHY WHY?? (Score:5, Interesting)
Why hasn't anyone put the GPU on the OPPOSITE side of the card yet? On every AGP card I see, the GPU is ALWAYS facing towards the PCI slots in the system, where it:
A. blocks out other PCI cards,
B. causes noise and instability if the fan is running too close to them, and
C. exhausts its heat onto those other cards.
Instead of trying to put the cart before the horse, why not just mount the GPU on the opposite side? There are no PCI slots to get in the way, and you could fit a HUGE cooling solution there.
Hey Nvidia, if you want to hire someone with more common-sense design tips like this, I'm available. I'll slap your engineers with a cluestick for ya.
Re:WHY WHY WHY WHY?? (Score:4, Informative)
Re:WHY WHY WHY WHY??Pic included (Score:4, Interesting)
I thought I'd take a picture [zeromag.com] and make a rebuttal to your statement. Gotta love digital.
In this pic there are 5 mobos:
Intel 850GB
Some asus socket370 thing
Some soyo socket370 thing
Iwill BD100 slot1
Some intel socket370 thing
You will notice on the asus board I put a tape measure across as a reference.
Now, out of the 5 boards sampled, only 1 has no space for heatsinks on the right side. Also note that this board is a Slot 1, which is no longer in production. On the other hand, every single semi-modern board in this picture has more than adequate room for heatsinks on the right side.
So unless these newer cards are going into an outdated system, putting the fans/heatsinks on the right side shouldn't be a problem, right? Simple enough solution without having to resort to heat pipes, water cooling, or piezoelectric cooling.
Re:WHY WHY WHY WHY??Pic included (Score:2)
I thought I would point one more thing out [zeromag.com]. There is an extra slot to the right of the AGP slot; I have the area circled in white. Quick question for the /. crowd: how many other people out there have a case with an unusable slot on the right like me? Seems to make perfect sense to put the fan there, doesn't it?
Re:WHY WHY WHY WHY??Pic included (Score:2)
> Why hasn't anyone put the GPU on the OPPOSITE side of the card yet?
Very easy: heat flows upwards, and the card itself would block the heat stream.
And of course the electrons would fall off.
Re:WHY WHY WHY WHY??Pic included (Score:2)
Mid left, so you have to leave about an inch open on the bottom of the cooler, again not a problem.
Bottom left, you would have to channel the exhaust back into the case, if you have adequate air flow, not a problem.
Top right, we agree here, but I did say that Slot 1 boards are no longer manufactured, so that was a moot point.
Bottom right, come to think of it, an internally exhausting fan wouldn't present a problem here either, as long as the cooler had adequate clearance over the RAM.
I could also link to every mobo manufacturer whose installation instructions say "Install your CPU and RAM first," but I'm too lazy and I think you get the picture. If the cooler is properly designed it could accommodate 90% of all Socket 370 boards (sorry, I don't do AMD so I can't speak for them there).
Re:WHY WHY WHY WHY?? (Score:2)
On the other hand, I have seen some motherboards that stick big capacitors right above the AGP slot, which would cause problems for your "HUGE" cooling solution.
Re:WHY WHY WHY WHY?? (Score:5, Insightful)
I would rather it blow the hot air at the other PCI cards than at the CPU. Most modern CPUs are already hot enough by themselves. Putting the GPU on the other side would essentially blow the hot air towards the CPU, making it hotter still.
The premise of the video card is obsolete... (Score:3, Interesting)
The graphics card arena has been a major exception to this for the last few years. It's one of the few industries I can think of where the product is actually GROWING in size and becoming more cumbersome as the technology becomes faster and more complex. I believe this is a sign that, not unlike what we discovered in the Pentium II/III era, card-based processor packages are poor product design: they are a) larger than necessary, b) gum up the works, and c) only worsen the problem of cooling, thus needing ever more complex cooling systems.
The current AGP (or PCI or whatever) expansion-card methodology for video cards can be seen as going through the same problem, especially in the case of the GeForce FX. We've seen these problems previously in the designs for the GeForce3 and 4, made much fun of them in the case of the 3dfx Voodoo5 6000 cards, and even the latest ATI cards are requiring more power than the AGP bus can provide. Doesn't this show that there is an inherent flaw in the packaging design for this technology?
GPUs need to take the same road that CPUs have taken (and have now returned to, since we use socket-based motherboard solutions again) and be sold solely as the graphics processor, with the memory substructures and so forth built onto the motherboard. This increases the efficiency and ease with which the GPU can communicate with the central bus and the rest of the system. In addition, you will no longer need to build an elaborate cooling structure to make up for the lack of ventilation provided by the typical AGP/PCI card-slot design.
Nvidia is partway there with the NForce already, building the graphics subsystem as a central part of the motherboard chipset and PC bus, but the flaw here remains (as in most integrated motherboard systems) that you are stuck with the technology. Of course, you can upgrade an NForce system with a full GeForce4 FX or Radeon if that is your choice, but that just brings back the card problem. What needs to be done is to create an NForce-type chipset with an FCPGA-type socket for the GPU as well as the CPU; that way both are eminently upgradable (not to mention the potential benefits of creating a more efficient in-line cooling solution for the interior of the system), and thus our size problems begin to be alleviated.
Re:THe premise of video card is obsolete.... (Score:2, Insightful)
(Actual clock speeds, not doubled DDR speeds). A drastic reworking of the motherboard layout, and a considerable increase in complexity, would be required to properly support this.
Then you get issues with the socketing standard: how long will ATI, nVIDIA, and everyone else keep playing ball with each other? How long before nVIDIA leans on a motherboard manufacturer using their nForce chipset and creates a non-standard socket? Power requirements, as well: will the motherboard be able to power the chip, or will we have to plug in a lead from the power supply, akin to these new powerhouse cards?
Interesting upsides would include the potential to use G4 Mac-style dual- or perhaps quad-processor modules for increased processing power, but that could easily saturate the bus, bringing us back to the original concept of having everything mounted on an independent board module.
Re:THe premise of video card is obsolete.... (Score:2)
In this case, a socket format would only make matters worse. One advantage of being on a card is that both sides of the card get airflow, dissipating heat. For the most part, a video card is a GPU and memory. Other stuff figures in, but the problematic part with the FX is the GPU's cooling requirements. The presence of extra memory isn't the problem. The best solution would be a spec that *requires* more space between the AGP slot and the nearest PCI slot. The rule of thumb for a long time has been that a PCI card next to the AGP slot is bad; this design simply changes that rule of thumb into a hard requirement. The external power seems sloppy and losing a PCI slot is a waste, but a socket format won't fix anything.
for those who don't want to take up a slot (Score:2)
Anticipating rear exhaust/cooling, the second-to-top one lines up with a slot, and the top one just has the cut in the back, in case you have a card that needs it.
Chances are you'll only have one card that uses a dual slot, and that leaves the rest to be used normally.
The people who always put the big-ass power supply at the very end of the strip so the rest of the plugs can be used will understand what I'm talking about.
Re:for those who dont want to take up a slot (Score:2)
I love the comment under the images. . . (Score:3, Funny)
How apt!
not just FPS anymore (Score:5, Insightful)
The best thing about the FX isn't the overall frames per second. It is the pixel shaders and such. The number of instructions it can execute per shader, and the rate at which it processes them, is the real evolution of this card. The more complex the shaders and the faster they run, the more lifelike graphics will look.
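To make the "instructions per shader" point concrete, here is a toy software sketch (purely illustrative Python, not NVIDIA's actual shader model): every extra operation inside the per-pixel function is repeated for every pixel of every frame, which is why longer, more complex shader programs need much faster hardware.

```python
# Toy software "pixel shader": each call does a handful of operations, and
# the hardware has to repeat that work for every pixel, every frame.

def shade(nx, ny, nz, lx, ly, lz):
    """Very small 'shader': Lambertian diffuse term for one pixel."""
    d = nx * lx + ny * ly + nz * lz   # dot(normal, light)
    return max(d, 0.0)                # clamp to zero

WIDTH, HEIGHT = 640, 480
light = (0.0, 0.0, 1.0)               # light pointing at the viewer
framebuffer = []

for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # pretend every pixel has a normal facing the camera
        row.append(shade(0.0, 0.0, 1.0, *light))
    framebuffer.append(row)

# 640 * 480 pixels * 60 frames/s is roughly 18 million shader invocations per
# second, so every instruction added to shade() is multiplied by that number.
print(len(framebuffer) * len(framebuffer[0]), "pixels shaded")
```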
We have been stuck in the same basic Quake engine for a while now. Unreal II and Doom 3 (Doom 3 more so) will be the first real change in graphics we've had. Now the GPUs can handle movie-style rendering without a ton of little tricks.
We really do need the horsepower. The FX could probably render Toy Story in real time; that is pretty amazing. I can't wait till I can watch a movie, pause it, and change the angle. The ability to have true 3D movie projection is becoming more realistic with this type of hardware (of course, we need the 3D projector).
$400 for this is nothing. You don't seem to realize that just 10 years ago a 486 DX system could cost over $4000 grand. With 16 megs of RAM and half a gig of hard drive. The price is rather low considering what it takes to create such wonders, so stop bitchin'.
Open source will help out in this arena as well. You've got to think that the pros who did the work on Gollum for LOTR are fans of open source, and it won't be long until those kinds of shaders and techniques are available to game programmers.
To me, saying "why do we need all this power" is kind of sacrilegious. Remember that increasing speed and creating a market for new hardware is what keeps most of us employed. Never say more speed is a bad thing. And don't blame sluggish performance on the developers; as software becomes more complex you have to give up some performance for stability and expandability.
Re:not just FPS anymore (Score:5, Funny)
4000 grand??? I think you paid too much.
It's $399....It's $399......It's $399...It's $399 (Score:2, Informative)
Best Buy preorder [bestbuy.com]
nvidia lost this one (Score:2, Insightful)
Re:nvidia lost this one (Score:3, Insightful)
It was nVidia's move to 0.13 micron that delayed the GeForceFX, and allowed ATI their moment in the sun. ATI have yet to climb that particular hill, and nVidia are already rolling down the far side.
Re:/. -ing in progress (Score:5, Informative)
http://boris.st.hmc.edu/~jeff/nvidia/ [hmc.edu].
Hope that helps.
Re:Is this a 3-slot monster?! (Score:2)
Looks more like 2 slots than 3. See picture #1 [hmc.edu].
In picture #4, from top to bottom, I see (1 and 2) the actual card, (3) a metal plate (i.e. no card), (4) a sound card, and so on. Maybe you mean that the metal plate is in place because you can't fit anything else in this space?
Re:Genuinely curius (Score:5, Insightful)
You're kidding me, right? Run UT2003 at 1600x1200 with everything maxed out, and even on a 9700 Pro you won't see more than 30 or so fps on certain maps (alone, just looking around; in heavy firefights I'd suspect it'll drop to the teens: I don't have a system like that, but I'm basing this on vidcaps I saw when UT came out).
Human eye is unable to perceive extra frames beyond a certain number
bs, it also really depends on what you're doing. If you're in a driving game going straight ahead and you get 30fps, you *might* not notice the difference between your 30 and 90fps. In a shooter or other game where the screen moves around quite a bit, I'm sorry but I can see the difference between 30fps and 70fps quite easily...
The moment somebody creates a card that is able to maintain refresh-rate-synced updates (say 85fps) in any available game at any resolution, regardless of what is going on, is the moment a new game will be announced that will take a card 4x as powerful to do the same.
It really never ends... of course if all you'd like to do is play counterstrike you can get by quite well like myself with a really old p3-450 + geforce1.
Re:Genuinely curius (Score:3, Insightful)
It's interesting to note that the US military has done extensive testing in this area, specifically so that they can build simulators as absolutely 'real' as possible, and not produce any extra frames (and the increased cost involved in delivering them). According to a few engineers from Evans and Sutherland, who at least used to build the image generators for them, the vast majority of fighter pilots were unable to distinguish between framerates above 60fps.
Of course, then there's the whole 'aliasing' you get whenever you actually have a 'frame-based' video, compared with 'real life'. Case in point: ever notice how helicopter blades, propellers, wheels, etc. seem to spin 'backwards' on TV? It's sample aliasing. Even your own eyes see this whenever your light source 'blinks', which is the case with nearly all artificial light. Take a bicycle tire, put it between your eyes and a fluorescent light, and spin it; you'll see the aliasing artifacts with no problems. Take the same bicycle tire outside (in sunlight) and do the same thing -- no more aliasing!
To realistically remove all aliasing, we'd have to have much higher framerates than 60fps; however, it's generally considered a 'normal' thing, since we grew up seeing it, and nobody fusses about it.
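The wagon-wheel effect is just sampling arithmetic. A small illustrative sketch, assuming an idealized camera that takes instantaneous samples at a fixed frame rate:

```python
# Back-of-the-envelope temporal aliasing ("wagon wheel") sketch, assuming an
# idealized camera that samples instantaneously at a fixed frame rate.

def apparent_rotation(spoke_hz, frame_hz):
    """Apparent per-frame rotation of a spoke pattern, as a fraction of a turn.

    Values near 0 look stationary; negative values look like the wheel is
    spinning backwards, even though it is really spinning forwards.
    """
    per_frame = spoke_hz / frame_hz        # true turns per frame
    frac = per_frame % 1.0                 # only the fractional part is visible
    return frac - 1.0 if frac > 0.5 else frac

for spoke_hz in (10, 24, 25, 59, 61):
    print(f"{spoke_hz:>3} Hz spoke pattern at 60 fps -> "
          f"{apparent_rotation(spoke_hz, 60.0):+.3f} turns/frame")
# A 59 Hz pattern appears to crawl backwards; a 61 Hz one crawls forwards.
```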
Re:Genuinely curius (Score:5, Informative)
There is also the fact that these are "average" frame rates: if your average fps is 30, you're quite often going to be getting sub-30 fps, resulting in jerkiness. So the ideal is an average somewhere around 75-135 fps, so as to remain perfectly smooth. (This refers to your question about why a gamer would want a new card.)
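A tiny illustration of how an "average" figure hides the stutter (the frame times below are made up for the example):

```python
# Why "average fps" hides jerkiness: the same average can come from very
# different frame-time distributions. Frame times below are invented.

frame_times_ms = [10, 11, 10, 12, 11, 80, 10, 11, 90, 11]   # two nasty spikes

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps")        # looks healthy (~39 fps)
print(f"worst frame: {worst_fps:.1f} fps")  # the stutter you actually notice
```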
Re:Genuinely curius (Score:2, Informative)
Part of why film [at the aforementioned 24fps] seems smooth is that motion blur is recorded on the film: when an object is moving too quickly for the light to capture a still image on the film [due to exposure], it captures a blur. Our brain loves to use that blur to assemble motion. Since computers lack this motion blur, they need more fps.
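A minimal sketch of the temporal-supersampling trick a renderer can use to fake that film-style blur (a 1-D toy example, purely illustrative):

```python
# Film integrates light over the exposure, so a fast-moving object smears.
# A renderer can approximate this by averaging several sub-frame samples
# ("temporal supersampling"). Purely illustrative 1-D example.

def render(position, width=20):
    """Render a 1-pixel-wide bright dot on a 1-D 'scanline'."""
    line = [0.0] * width
    line[int(position) % width] = 1.0
    return line

def blurred_frame(start_pos, velocity, subsamples=8, width=20):
    """Average several instants within one frame to approximate motion blur."""
    acc = [0.0] * width
    for i in range(subsamples):
        t = i / subsamples                     # time within the frame, 0..1
        for x, v in enumerate(render(start_pos + velocity * t, width)):
            acc[x] += v / subsamples
    return acc

print(render(5))                 # sharp dot: what a no-blur game frame shows
print(blurred_frame(5, 6.0))     # smeared dot: closer to what film records
```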
Re:Genuinely curius (Score:2)