Hardware

nForce2 Preview

An anonymous submitter writes "I noticed that a review of NVIDIA's nForce2 chipset has been posted here. From what I can gather, the chipset contains two 10/100 ethernet controllers, six USB 2.0 ports, UltraATA133 support, three 1394 ports, five PCI slots, and an integrated GeForce4 MX core including NVIDIA's nView technology and a TV Tuner." Tom's Hardware and NVNews also have looks at it.
  • According to the NVNews article, they have a reference motherboard?

    Perhaps this has something to do with cooling ... if you look at the pic, the AGP slot is very close to the next PCI slot - perhaps the weight/size of cooling equipment makes the AGP slot impractical for the most powerful graphics chipsets.

    • According to the NVNews article, they have a reference motherboard?

      That's not what it says. If you read closely, you'll see that they tried to simulate the performance of an nForce2 mainboard by using an nForce motherboard with an underclocked GeForce4MX 460.
  • This chipset is designed to be used in OEM boards for good performance and enthusiast customers...not servers. I can't think of any legitimate use for *two* ethernet controllers other than in a broader network application (Firewall, for instance).

    Perhaps you could use it to make a really stupid sort of bus network for LAN parties using nothing but crossover cables, but that's such a silly idea (performance/configuration issues) that it's probably true...
    • It's for motherboards where space is limited too. I'd like a server motherboard that has the integrated video for instance, since I don't care about keeping latest-greatest video support there, but definitely don't want to waste space for a video card.

      By building it into the chipset, they allow a variety of different motherboards for all sorts of applications to be developed using just one chip.

    • > Why two ethernet controllers?

      Why not? It's also said to have six USB ports, which is about five more ports than most people ever need.

      If the cost of adding a second port is very small, there's no reason not to do it. Saves joe power user some time and money when he realizes he needs a second port, and joe average user will never be harmed by having it there.
    • Well, firewalls are becoming very important as more people get "always on" internet access. In addition, the number of people getting a second computer is increasing. Plus, chances are they are going to use some RealTek POS [freebsd.org] chip on there, so the second controller is only going to cost a few cents.

      Also, this might be useful at LAN/fileshare parties where people don't want to saturate the "gaming" network, so they set up a second filesharing network.
      • Actually, apparently NVIDIA makes the integrated Ethernet port (integrated into the motherboard chipset), and 3Com makes the integrated, but circuit-isolated, "add-in" card. 3Com makes superb cards and chipsets.
    • I can't think of any legitimate use for *two* ethernet controllers other than in a broader network application

      I can think of two.

      • DSL modem (many of which use ethernet) and a regular LAN. I am told you can just put the modem on the LAN, but on my Linux gateway I could not make that work.
      • Thin client - very handy even at home, use your clunky old PC as an X-Terminal onto a server, 100BaseT private connection de-congests the (maybe 10BaseT) LAN
      • Cheers, Andy!

        • DSL modem (many of which use ethernet) and a regular LAN. I am told you can just put the modem on the LAN, but on my Linux gateway I could not make that work.

        Works just fine here, although with FreeBSD. I'm sure it would work with Linux too, but don't know how exactly. On FreeBSD one network interface can have many addresses -- I have one external (assigned by the ISP) and two internal (10.0.1.x).

        On Linux, you have to "clone" the interfaces to achieve that (eth0:0, eth0:1)...

        The DSL modem then goes into the switch together with the rest of your network, under the assumption that the ISP is smart enough not to let the private-network packets through their routers, so no one can target your private LAN directly.

        The firewall rules become quite complicated, though. By using two separate physical interfaces you alleviate both issues: having to rely on the ISP's wits, and the firewall spaghetti (see the sketch below).

        I know people think the little NAT routers sold by everybody and their brother are more secure, but they all have usability problems. Mostly their NAT implementations suck -- typically, you cannot ping or traceroute through them, and sometimes FTPing is troublesome. Prolonged idle TCP connections sometimes get closed out of the blue, etc.
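
        To make the spoofing point concrete, here is a minimal Python sketch (the interface names and the 10.0.1.x network are made up, echoing the example above; a real firewall would express this as a packet-filter rule rather than application code):

            import ipaddress

            # Hypothetical home LAN, as in the 10.0.1.x example above
            PRIVATE_LAN = ipaddress.ip_network("10.0.1.0/24")

            def drop_spoofed(src_ip: str, iface: str) -> bool:
                """Anti-spoofing check for a two-NIC gateway: a packet claiming
                a private-LAN source must never arrive on the external NIC."""
                return iface == "ext0" and ipaddress.ip_address(src_ip) in PRIVATE_LAN

            assert drop_spoofed("10.0.1.5", "ext0")      # spoofed: drop it
            assert not drop_spoofed("10.0.1.5", "int0")  # legitimate LAN traffic

        With one shared interface there is no second argument to test, which is exactly why you end up trusting the ISP's filtering instead.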

    • The "controllers" are just a few extra pins out of hundreds on the chip. A very tiny patch of real estate is lost if you don't use it. A great amount of circuit board real estate is gained if you do use it, as it only requires a few tiny inductors and connectors to implement it.

      I'd want to see a few more controllers on the chipset myself. What's another milliwatt and a few more pins among friends? Imagine the clustering potential of these chipsets...
    • Ummm, cable/ADSL companies that require you to use the provided one-port router with only one active DHCP address - where you may need to set up one machine as a NAT box and firewall.

      This was certainly the case with NTL in the UK. I had one workstation, one lappy and another machine I use purely for downloads - I set up the download machine with two PCI ethernet cards for just this purpose.

      However this was around 2 years ago, and you can now buy ADSL boxes with built-in DHCP, routing and NAT fairly cheaply (I think there's even one that doubles up as a cheap wireless access point).

      Which still leaves the firewall - have you ever seen how expensive an outboard hardware firewall is (given I already have PCs that could do this)? I could set up a separate software firewall on each machine in the network - but what a pain in the behind when I can set up a dedicated machine in no time - and then having two ethernet ports is a boon.

      However, in the case that the user has a standalone system, all of this is redundant - but since it is so cheap, and any reasonable OS will allow either port to be used (saving the "which port?" question for tech support), I can't see any problem with it being there.

    • Suppose this was used in an Xbox-like device. Currently you can connect 2 Xboxen together for 4x4 gaming. Wouldn't two ethernet controllers allow it to connect both online and to another Xbox? This is certainly a very good thing.
    • I could think of plenty of reasons. Access to multiple LANs (our shop runs an office LAN and a network/server LAN), access to controlled networks running IPSec, etc.

    • Internet connection sharing. I have a dual-port NIC in my system for this exact purpose. Sure, the box is firewalled too, but its main function is to distribute one connection in to the other two systems in the house via a hub.

      I'm much too cheap to buy the extra IPs or a router. I had the hub and the NIC sitting around (salvage sales - woo!) and it was an easy way to avoid paying $20 / month or $100 for an actual router.

      I'd like this board on the grounds that it saves me a PCI slot or two. My machine is all filled up right now; I couldn't put a second NIC in anyway. I was glad to have the dual-port card.
    • This chipset is designed to be used in OEM boards for good performance and enthusiast customers...not servers. I can't think of any legitimate use for *two* ethernet controllers other than in a broader network application (Firewall, for instance).

      Well, there are apparently a lot of different uses for having a second ethernet port, and I won't bother to list them all here (as others have done so for me). My question is why two different ethernet controllers? If you've already got your own controller built in once, why not just duplicate it? Why license someone else's controller? Granted, 3Com makes pretty solid NICs (they're all I use), but if you're concerned that yours aren't up to snuff, why not just license 3Com's from the beginning? It would certainly simplify things from a configuration and support perspective anyway.
  • "nForce Preview"
    Read: the guts of a watered-down version of Xbox2. (I presume nForce3 or whatever will power the X^2.)
  • Processor? (Score:1, Funny)

    by hofer ( 84209 )
    Do I need a processor for this?
    • Not sure if you meant that as a flame or not, but yes, you do need a CPU for this, an Athlon CPU to be exact.

      IIRC, NVIDIA doesn't have a license to make Pentium 4 chipsets, so they are pretty much stuck with AMD's processors for now (I think they are making a chipset for the Hammer as well).

      I think nForce 2 is great, but watch out for ATi's new chipset as well. (I smell a chipset super price war in the distant future!)
      • Sorry to nitpick, but you mean an AMD processor; AFAIK a Duron would also work with it, would it not?

        You do wonder if this level of reintegration may mean that the option of buying throw-away single-board PCs is being re-explored. It's certainly not what I want, and many Slashdotters like me find the idea very repulsive. But it would generate great revenue for manufacturers, and your mother/non-techie PC user would probably love cheap, smaller-footprint single-board machines.

        Does anyone here remember the appeal of the ZX81 being a single board computer?

        Of course, we must make sure that the hardware separates market doesn't die... otherwise what would I tinker with when bored?
        • I think the Duron works fine too (a Duron is just an Athlon with less cache, running at a lower FSB).

          Yeah, I used to think this way too, but think about it: sound, Ethernet and ATA controllers are pretty much the same nowadays, and NVIDIA's sound is pretty decent to begin with (I think they are using the same logic as the Xbox).

          Onboard graphics is a big turn-off since it pretty much closes off your upgrade path, but I think NVIDIA is doing the right thing by including a lot of PCI slots and an AGP slot for future upgrades. Let's hope the boards will get cheaper.

          Other than that, I think it's an extremely attractive option for OEMs, since you get a highly integrated mobo with the NVIDIA brand right on it.
  • RTFA (article) (Score:5, Informative)

    by carlcmc ( 322350 ) on Tuesday July 16, 2002 @09:23AM (#3893431)
    to quote: "Because the traditional modem is being replaced by more modern technologies, such as DSL, a network card is pretty much indispensable these days. However, a single interface is only sufficient if the PC is to be connected to either the Internet or a local network, but not both. If you need to connect the PC to both, then you definitely need a second port."

    For a home with more than one computer and a cable modem, this makes perfect sense. For a couple of dollars more, it would be stupid not to...

    • No... This relates to Xbox2. Microsoft will be getting with cablecos in order to provide an integrated set top box/router so customers can have multiple PCs through a connection that can be managed/supported by the cableco. It is a great idea, if you ask me.

      Xbox/Tivo/Router/Web Server - all managed by Big Brother and everyone will pay extra. MS and cableco will have a piece of the additional revenue.
  • by teamhasnoi ( 554944 ) <teamhasnoi AT yahoo DOT com> on Tuesday July 16, 2002 @09:27AM (#3893460) Journal
    due to the newfound explosive nature of mice, I'm going to skip out on buying hardware for the time being.
  • by imr ( 106517 ) on Tuesday July 16, 2002 @09:31AM (#3893491)
    On the topic of current Nvidia cards:

    Do not buy a GeForce4-MX for Doom.

    Nvidia has really made a mess of the naming conventions here. I always
    thought it was bad enough that GF2 was just a speed bumped GF1, while GF3 had
    significant architectural improvements over GF2. I expected GF4 to be the
    speed bumped GF3, but calling the NV17 GF4-MX really sucks.

    GF4-MX will still run Doom properly, but it will be using the NV10 codepath
    with only two texture units and no vertex shaders. A GF3 or 8500 will be
    much better performers. The GF4-MX may still be the card of choice for many
    people depending on pricing, especially considering that many games won't use
    four textures and vertex programs, but damn, I wish they had named it
    something else.

    (all this comes from carmack's .plan:
    http://webdog.org/plans/1/ )

    It seems NVIDIA is going down the same road as Intel and SiS with their cheap video-on-board motherboards. All of them sucked! Good luck!
    • For the majority of home users (who aren't power-gamers), a GeForce 4MX is just fine. And there is an 8x AGP slot for the power gamers...
    • It seems NVIDIA is going down the same road as Intel and SiS with their cheap video-on-board motherboards. All of them sucked! Good luck!
      Except that their on-board video doesn't suck; granted, a GF4MX420 is not the top of the line, but would you really want a motherboard that costs an extra $400, that is going to be wasted in 6 months? Didn't think so.

      Compare the onboard video with any other on the market, and you will notice it does anything BUT suck - it wipes the floor with them, and it can probably do the same with quite a lot of budget cards out there as well. Note the "budget" in that sentence before you fly off the handle.

      Not to mention that you have the option of dual VGA/DVI output PLUS TV-out. I don't know about you, but compared to the "external" gfx card I have now, that's a lot better! Not to mention that it's also a lot faster than my current GF2 MX400. No, I don't need to play Doom ]I[, and if I did, I wouldn't buy a mobo with integrated graphics - and frankly, your idea that anyone would is an insult to their intelligence, no matter how appropriate such observations might be.

      I think you need to hear three little words that no one has told you in a long time:

      GET A LIFE!
    • by ocbwilg ( 259828 ) on Tuesday July 16, 2002 @11:37AM (#3894628)
      It seems NVIDIA is going down the same road as Intel and SiS with their cheap video-on-board motherboards. All of them sucked! Good luck!

      Hmm... maybe that's why NVIDIA also makes an nForce2 part that doesn't have integrated video. Oh wait, you'd actually have had to read the article to know that. Never mind.

      Seriously folks, integrated video is not always a bad thing. When I built a system for my father I used an nForce board because for $120 I could get a system with onboard video, audio, and ethernet. If I had bought a non-integrated solution it would have cost me over $200 for components of similar quality separately. Does my father need screaming-fast graphics power or Dolby Digital 5.1 so that he can play Doom 3? No, an nForce was more than adequate. All he wants to do is browse the web, send emails, work on his genealogy database and VPN into work so that he can do his job (UNIX tools development for Lucent).

      Now with the nForce2 there's another option for me. If I want I can get an nForce2 board without integrated video that still takes advantage of Dual Channel DDR400 (how many other mainboards have that?) and has high-end audio, USB 2.0, Firewire, and dual ethernet controllers built in. Then I can go out and buy a GeForce5 (or whatever they want to call it then) and have a screaming gaming system.

      What would be really nice is to see this in one of the new Shuttle SS-series systems.
      • Well, to browse the web, send emails, work on his genealogy database and VPN into work so that he can do his job (UNIX tools development for Lucent), you just don't need 3D power to begin with.
        A good PCI card would be enough, but that's not the point.
        That NVIDIA also makes an nForce2 part without integrated video is not the point either.
        There were two points in my post:
        First: they call a card "GeForce4" that is not even a GeForce3. That is not my point, it's Carmack's point (let's give credit where it's due).
        Second: I did not build a cheap box for my father with an integrated motherboard for his emails and surfing. I built a cheap box using spare parts, but real, stable hardware. Why? Because I used to work in a small company where we sold cheap boxes, and I had to help troubleshoot those boxes. Integrated sound was already bad (VIA chipset!!!), but integrated video was hell - to the point that we refused to build them anymore after a while.
        So a real "good luck" to NVIDIA in delivering a good cheap product in this area. And a real "good luck" to all the repair guys and salesmen and customers if they don't.
        But this GeForce4 name is already a bad start:
        "Why is my Doom 3 choppy? It's a GeForce4 that's inside. That's what you told me when I bought it." Whine, whine. Complain, complain... more whining... more complaints...
        Those were the points in my post, but I understand yours about building a good system for less money.
      • Seriously folks, integrated video is not always a bad thing.

        I agree; in fact, I'm reading this on my onboard ProSavage. Of course I wouldn't need to do that if my GeForce2 fan hadn't seized last week! Damn you Nvidia, Damn You.

        I will still buy a GF4 though :-/
  • Of course, I didn't read the article...
    Is there any support, planned or actual, for load balancing on those dual NICs? Like the old Znyx multiport NICs?
  • Is it just me, or is NVIDIA trying its damnedest to piss off Intel? Reading the news section on www.nvidia.com, it seems like every 5th sentence states how they are working with AMD to produce this, do that, etc., etc. What about Intel? I personally prefer AMDs, but it seems kind of strange that NVIDIA would ignore the larger of the chip manufacturers.
    • Maybe because AMD is cheaper, and working with them will entitle NVIDIA to even cheaper AMD components? They may have to put up with the Intel Premium if they decided to go with Intel. Why go with Intel (more expensive, less performance per GHz) when you can go with AMD (much cheaper, better performance per GHz)? The choice is clear.
    • Re:NVIDIA and AMD (Score:2, Interesting)

      by benzapp ( 464105 )
      Intel has always been a little hostile to anyone else producing chipsets. I believe with the Pentium IV, Intel has forbidden any third parties from producing chipsets. They went after VIA hard over this issue. I don't think Nvidia wants to deal with that sort of bullshit.

      I also believe that Nvidia realized something with their GeForce 3: the damn thing was more advanced than many of the CPUs available at the time.

      We are at an interesting point in computer history here. With graphics chips being as advanced as, or more advanced than, the CPU, it is only natural that the two be brought closer together.

      AMD and Nvidia seem to be doing that, while Intel is not really paying attention.

      • by ocbwilg ( 259828 )
        Intel has always been a little hostile to anyone else producing chipsets. I believe with the Pentium IV, Intel has forbidden any third parties from producing chipsets. They went after VIA hard over this issue.

        Intel actually licenses the IP necessary to design chipsets for the Pentium IV. The reason that they went after VIA for making a Pentium IV chipset was because VIA didn't go to Intel to get a license for the technology. VIA claimed that when they acquired S3 they also acquired the license to utilize the Pentium IV bus technology (since S3 had a license), and it's been fought out in the courts since then.

        Regarding Intel's hostility to third-party chipset makers, that only makes sense. After all, making chipsets for their CPUs is a large portion of Intel's business. By licensing their bus protocols to third parties Intel is making sure that they get a cut of every Pentium 4 chipset sold. They're also raising the costs of competitors' chipsets to put them roughly in line with their own. Given the choice in that situation, most people would go with Intel.

        Also keep in mind that controlling the chipsets also allows you to control the technology that is used in them. The Rambus memory fiasco is an excellent example of that. Rambus turned out to be an expensive dud on the early Pentium 4 systems, but Intel was contractually obligated to support only Rambus RDRAM memory and no other memory type on the Pentium 4 for a certain period of time. During that time VIA was producing a less expensive and better performing SDRAM-based chipset for the Pentium 4. Most people went for VIA chipsets on their Pentium 4 systems and that was hurting Intel's chipset business, so Intel of course attacked in any way it could.
  • Good/decent expansion support with the PCI, FireWire and USB 2.0 slots/ports... And honestly, onboard NICs aren't THAT bad...

    But a GeForce4 MX? Dear god! Any Ti model (that's ANY model, be it GeForce2, 3 or 4) would have been better!

    Not that I'm one for integrated graphics anyway.
    • On-board video doesn't have to be expensive. It's never going to be the most modern display technology; it's just best to make something that will function.

      People will upgrade the video, but it's better to have something on-board than nothing, especially for uses where running the latest games is not the purpose of the machine. It's all about making the chipset more configurable for different purposes.

    • And then the board would cost $200 more. You want super duper graphics for gaming, then buy a real GeForce card. I'm sure they thought about this for 10 seconds and came to the realization that the increased cost would decrease their market significantly.
    • ...you would realize they are releasing a version with integrated video and a version without.
  • I tried to read that article, but the flashing "you've won" advertisement was making my eyes bleed.
  • MX Core? (Score:2, Troll)

    by AAAWalrus ( 586930 )
    *snip*
    integrated GeForce4 MX core
    *snip*

    My question is this: who are they trying to sell this to? Not gamers, since a GeForce4 MX is a stripped-down, cheaper version of the real powerhouse GeForce4 Ti, which is the new bar for nVidia cards. Obviously, they're not selling this to power users who build their computers piecemeal, because, well... an integrated board by definition defeats that purpose.

    Granted, gamers aren't where the money is. The money is in getting someone like Dell or Gateway to use this board in their corporate lease computers. By convincing big manufacturers that the overall cost of making a computer is lower when buying one big all-in-one solution board, they hope to break into new markets.

    *snip (from review)*
    aimed not only towards the high end but the mainstream
    *snip*

    There you have it. It's a great product, but if you're a typical slashdotter, you're probably not going to care because:
    a) nVidia Linux support has been a bit shoddy (IMHO - although the fact that they have drivers at all is a positive note)
    b) it's not high end - it's a glorified GeForce2
    c) it's integrated, meaning hard to replace if something goes out and not customizable
    • Re:MX Core? (Score:3, Interesting)

      by ghjm ( 8918 )
      I'm guessing that when you buy a computer, you aren't spending your own money.

      The nFORCE concept is to capture low-end market share by providing much better specs than the alternatives, for people who are price-constrained. Suppose you had $400 to build a computer (not including the monitor). The nFORCE architecture is by far the best deal you can get. At this price point, a GeForce4 Ti was never in the cards anyway.

      What nVIDIA has recognized is that the traditional price points for high-end ($3000+) or even midrange ($2000+) PCs have gone the way of the dodo. Ultra-cheap PCs are such a good deal for the majority of buyers that that's where most of the market share is going to be in a few years, if it isn't there already.

      -Graham
      • The nFORCE concept is to capture low-end market share by providing much better specs than the alternatives, for people who are price-constrained.

        I'm not sure that's the case. If it were, the nForce2 chipset would have more expensive higher performing optional components. Would you like your Southbridge to have firewire, dual nics or a Dolby Digital 5.1 audio processing unit? If not, they have a stripped down version for the cost conscious. Do you want your Northbridge to have a GeForce4MX core integrated, or would you prefer one without the integrated graphics so that you can choose your own card? Either way you're still getting Dual Channel DDR400 memory interfaces, a performance option only available on nForce boards.
    • Not gamers, since a GeForce4 MX is a stripped-down, cheaper version of the real powerhouse GeForce4 Ti

      This is like saying "Why buy a 450HP car when you can get one that's 500HP?" Performance has become all but irrelevant. Price and form factor are what matter. If this lets manufacturers create smaller, cheaper, cooler-running computers, then that's great.
    • "since a GeForce4 MX is a stripped-down, cheaper version of the real powerhouse GeForce4 Ti"

      As posted many times by many people, the GeForce4 MX is a GeForce2 core with a higher clock speed. The only reason it has "GeForce" and "4" together at all is that the marketers at nVidia knew they could sell more parts.

      Read Carmack's .plan about it: http://webdog.org/plans/1/ [webdog.org].

  • Looked at all three of the articles, and darned if I saw more than a listing of the lower-end board at $100, which would be about right. The higher-end board would be perfect for a LAN party rig, or a computer for your stereo, or some such, if the price was right.
  • by levik ( 52444 ) on Tuesday July 16, 2002 @09:42AM (#3893572) Homepage
    The upcoming nForce 3 chipset will include a built-in heat sink, firewall and a nifty solitaire game.

    Plans for nForce 4 (still some time in the future) include an embedded version of Java and/or Internet Explorer.

  • An anonymous submitter writes "I noticed that a review of NVIDIA's nForce2 chipset has been posted here"

    It's a preview, not a review.
  • I'm not sure whether to feel juiced about the fact that the nForce has USB 2.0, Ethernet, Firewire, TV Tuner (!!) and a bunch of PCI slots built in automatically, or unhappy that they've paired all of these great features with what amounts to a budget on-chip video card.
  • There is also a preview article at AnandTech.

    See here (one long page for printing) [anandtech.com] or here (8 pages) [anandtech.com]

  • by chromakey ( 300498 ) on Tuesday July 16, 2002 @09:50AM (#3893628)
    "...the chipset contains two 10/100 ethernet controllers, six USB 2.0 ports, UltraATA133 support, three 1394 ports, five PCI slots, and an integrated GeForce4 MX core including NVIDIA's nView technology and a TV Tuner."

    What, no kitchen sink?

  • a partridge in a pear tree?

  • How much better can a chipset make a computer?

    Ever since I bought my AMD Thunderbird 1.0 GHz and skimped on the motherboard, I have been wondering if maybe I cheated myself on performance.

    What do you think? Would a more expensive motherboard increase my gaming performance enough to justify the cost of a new motherboard?

    Parts list:
    AMD 1.0 GHz Thunderbird
    256 MB DDR2100
    NVIDIA GeForce2 MX400
    Sound Blaster Live! X-Gamer
    • Seems you skimped a little too much on the motherboard, as it's not even on the parts list. Perhaps if you had the crucial device connecting your CPU, RAM, graphics and sound cards together, you might get a little performance out of them. ;-) But seriously... I'm not sure what bus speed the 1.0 GHz Athlon does (I think it's 200 MHz), but more than anything else, it's bus speed that matters - of course, you need a board that can support high bus speeds to maximise performance, which, if you have DDR RAM, you should have. You'll still need a CPU with a 266 MHz bus to maximise performance from your current setup, though. Could be time to get an Athlon 1.4 GHz, at least. (Rough numbers in the sketch below.)
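
      As a back-of-the-envelope sketch (assuming the Athlon's 64-bit EV6 bus; the 200 and 266 figures are the double-pumped transfer rates mentioned above):

          # Athlon's EV6 front-side bus is 64 bits (8 bytes) wide and double-pumped:
          # a 100 MHz clock moves 200 MT/s, a 133 MHz clock moves 266 MT/s.
          def fsb_bandwidth_mb_per_s(mega_transfers: int, bus_bytes: int = 8) -> int:
              return mega_transfers * bus_bytes

          print(fsb_bandwidth_mb_per_s(200))  # 1600 MB/s for a "200 MHz" Athlon
          print(fsb_bandwidth_mb_per_s(266))  # 2128 MB/s for a "266 MHz" part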
  • What? No open AGPx4 (or x8) port for my ATI 8500? ;)
    • Umm... Didn't you read the article? I believe it said there is an 8x AGP port as well...

      So you are not stuck with the MX. And with the other features, it's not a bad setup, as long as the integrated GFX is not set up to use any of your system RAM regardless of whether it's enabled or not.

      • I interpreted the article as saying the AGP 8x port would be integrated and used by the onboard graphics chipset. I don't think it will be coming with an open AGP slot, but then again it's just a motherboard spec, so I guess it can be modified by other manufacturers. Of course, what would be the point... hence my joke...
    • Sure there is, just get the version without the integrated video. Actually, I built a system with an nForce1 that used a GF3 Ti200 even though the board had integrated video; the board without integrated video was like $3 more, and the integrated unit had an AGP port, so it made no difference. Dual-channel DDR, actually good integrated audio, and all the other connectors will make it a decent board even if you don't want the integrated video.
  • by AppyPappy ( 64817 ) on Tuesday July 16, 2002 @10:09AM (#3893797)
    The chipset contains 5 PCI slots? I'm impressed. I had to buy a motherboard to get that.
  • Is it just me (Score:2, Interesting)

    by Raghead ( 167999 )
    Or does anybody else think these are just press releases, not reviews, as listed??
  • by Chris Carollo ( 251937 ) on Tuesday July 16, 2002 @10:19AM (#3893902)
    nVidia's marketing department should be ashamed; the name of this piece of hardware is blatantly misleading. Every other "MX" version of their cards contained the same feature set as its GeForce X counterpart, but had slower/less memory.

    The GeForce4 MX, on the other hand, is missing the principal feature of the GeForce3, that being hardware vertex and pixel shader support. The GeForce4 MX is basically a really fast GeForce2. It's a sham.

    It screws developers (no longer can we say "GeForce3 and up"; we have to qualify by specifically excluding the GeForce4 MX). It screws customers by making them think they're getting a better card than they are. It's just bad all around.

    When I talked to an nVidia rep at this year's GDC, he acknowledged its hatefulness and gave the impression that it would be going away shortly. Given the number of these cards I see in stores, and this announcement, I'm starting to doubt him.

    Note to nVidia: when your marketing department starts screwing developers and customers, we developers stop wanting to support your cards. You've been at the head of the pack for a while now. Crap like this isn't how to stay there.
    • Agreed. The GeForce 4MX name confuses users. It confuses developers, too. Does it mean that the hardware vertex and pixel shader support in all GeForce 3 and non-MX GeForce 4 isn't to be considered "mainstream"? Should developers starting on a new title assume their target machines will have that hardware? The Chameleon demo is gorgeous, but not much else really uses that hardware yet.

      The GeForce 3/4 shader engine has a huge transistor count, around 57 million, more than many CPU chips. But the GeForce 2 and the nForce family are much simpler parts. There's a real cost for that hardware. The nForce was supposed to be a low-end product, so it's not surprising that it came with GeForce 2 capabilities. But many developers were hoping that the old GeForce 2 architecture would be phased out in the next round. That's not happening yet.

      NVIDIA's parts are fine; it's just their name confusion that's a pain. From a developer perspective, it means you can't just put "Requires GeForce 3 or better" on the box in big type; you end up probing for capabilities instead, as in the sketch below.
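
      A rough sketch of such a capability probe in Python (GL_NV_vertex_program and GL_NV_texture_shader are real OpenGL extension names of the era, but the sample strings below are made-up illustrations, not actual driver output):

          # Capability check instead of "Requires GeForce 3 or better":
          # probe for the shader extensions rather than matching the card's name.
          REQUIRED = {"GL_NV_vertex_program", "GL_NV_texture_shader"}

          def has_shader_support(gl_extensions: str) -> bool:
              return REQUIRED <= set(gl_extensions.split())

          # Hypothetical extension strings for illustration only:
          card_a = "GL_ARB_multitexture GL_NV_vertex_program GL_NV_texture_shader"
          card_b = "GL_ARB_multitexture GL_NV_register_combiners"

          assert has_shader_support(card_a)      # shader-capable part
          assert not has_shader_support(card_b)  # GeForce2-class feature set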

    • nVidia's marketing department should be ashamed

      yes they should, but shouldn't anyone who is in marketing? ;)

  • The current nForce is available in small form factor mobos like the Abit NV7M. I'd be very interested in a tiny version of nForce2.

    If you plan to use TwinBank (6GB/sec system bandwidth!) you only need/want two DIMM slots. With video, audio, network, firewire, usb2, etc, all built-in, you hardly need the PCI slots at all.

    Fewer components should also mean lower power consumption, which means fewer/slower fans, which means blissful quiet computing. Hopefully.

    Also, any word on the rumored Shuttle SS41 yet?
  • Tom's Hardware (Score:2, Insightful)

    by AnimalSnf ( 149118 )
    Is it me, or are the lackeys being hired at Tom's getting dumber and dumber with time? I don't have time to run down the entire list of inaccuracies and errors in the article, but according to them DDR400 "corresponds to a performance level that SDRAM could only achieve at 400 MHz," and best of all, Nvidia was "Founded in 1997 by a handful of ex-SGI employees."
    • Re:Tom's Hardware (Score:3, Interesting)

      by Pulzar ( 81031 )
      DDR400 "corresponds to a performance level that SDRAM could only achieve at 400 MHz,"

      Why is this inaccurate? DDR400 is DDR running at 200MHz, which is equivalent to SDR running at 400MHz. (Worked numbers below.)
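
      Worked out, assuming the standard 64-bit (8-byte) module width; the dual-channel line matches the 6GB/sec TwinBank figure quoted elsewhere in this discussion:

          # DDR400: 200 MHz clock, two transfers per cycle, 8-byte bus.
          clock_mhz = 200
          transfers_per_clock = 2        # "double data rate"
          bus_bytes = 8

          per_channel = clock_mhz * transfers_per_clock * bus_bytes  # 3200 MB/s
          dual_channel = 2 * per_channel                             # 6400 MB/s

          # SDR at 400 MHz moves the same 3200 MB/s per channel:
          assert per_channel == 400 * 1 * bus_bytes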
  • Anandtech's article [anandtech.com]. This one is much more than just a "brief" overview... it is meaty :)
  • by Andy Dodd ( 701 )
    Does anyone know if the BIOSes for the nForce (and presumably the nForce2, if the nForce has it) have PXE support for network booting?

    ANY provision for network booting?

    Netbooting would make these into killer thin-client motherboards... (a sketch of what PXE involves is below)
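
    For reference, "PXE support" boils down to the firmware doing a DHCP exchange to learn a boot file name, then fetching that image over TFTP. A minimal sketch of the TFTP read request involved (packet layout per RFC 1350; the boot file name is just an example):

        import struct

        def tftp_read_request(filename: str, mode: str = "octet") -> bytes:
            """Build a TFTP RRQ packet: opcode 1, then NUL-terminated
            filename and transfer mode (RFC 1350)."""
            return (struct.pack("!H", 1)
                    + filename.encode("ascii") + b"\x00"
                    + mode.encode("ascii") + b"\x00")

        pkt = tftp_read_request("pxelinux.0")  # hypothetical boot image name
        print(pkt)  # b'\x00\x01pxelinux.0\x00octet\x00'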
  • ...except for one little consideration: NVidia has not released full drivers for Linux for the nForce. Are they going to be any better for the nForce2?

    I am definitely looking at an nForce 2 based solution to upgrade a Windozer of mine, but this would be a splendid solution for Linux if they had the drivers for it. I hope NVidia gets on the ball this time.
  • Now, if somebody like Jumptec [jumptecadastra.com], Ampro [ampro.com], or any of the other embedded CPU board makers would use this! I'd love to have that for my [p25.com] embedded system - fast graphics for all the traces, USB 2.0 for RF control, two Ethernet ports for access...

    I wonder if anyone could persuade nVidia to put one of these [intersil.com] in there... They have everything else....
  • The MX series of GeForce4 GPUs is woefully underpowered; wait for the next revision of the motherboard that has a non-MX GPU. This is such a scam, and so many people are getting burned because they see a cheap GeForce4 card and think it is a great deal when it is just a fast GeForce2. Spread the word!
  • by Namarrgon ( 105036 ) on Tuesday July 16, 2002 @01:49PM (#3895748) Homepage
    ...and a TV Tuner.

    This is incorrect. The chipset includes a TV Encoder, i.e. supports "TV Out" - S-Video or composite out to a TV. From the press release [nvidia.com]:

    NVIDIA nForce2 Platform Processors offer a staggering array of features including:

    * TV-encoder and HDTV processor for optimal visual quality

    It does not include a TV Tuner capable of receiving broadcast TV. You'll have to add one yourself.

    BTW, if you're wondering, the HDTV processor simply means it is capable of decoding HDTV-format MPEG2 video. You would still need an HDTV tuner/receiver to get the signal first.

  • The EETimes article [eetimes.com] covers a lot of the same ground. Some info from it:

    Since NVIDIA doesn't have a license to develop for the Intel bus, this will interface to AMD processors (uh, despite the fact that the Xbox is Intel-based). A version for the Hammer is "far along" and may merge north and south bridge functions into one chip.

    Four Taiwanese motherboard manufacturers, including Asus and Chaintech, will use the chips.

    A future version for server line cards may include Gigabit Ethernet, routing capability, and a HyperTransport link to network processors.

  • Legal Linux drivers? (Score:2, Interesting)

    by Mike_L ( 4266 )
    I'm still waiting for Linux drivers that can legally be linked into the kernel. nVidia forces me to infringe on the GPL when I use their nvnet driver. It would also be nice to have the nForce drivers in the kernel distribution.

    -Mike_L
