Nvidia Unveils New Mid-Range GeForce Graphics Card

crookedvulture writes "Nvidia has uncorked another mid-range graphics card, the GeForce GTX 560 Ti. Every tech site on the web seems to have coverage of this new $250 offering, and The Tech Report's review will tell you all you need to know about the various flavors available, including how their performance compares to cards from 2-3 years ago. Interestingly, the review concludes that pretty much any modern mid-range graphics card offers smooth frame rates while playing the latest games at the common desktop resolution of 1920x1080. You may want to pay closer attention to power consumption and noise levels when selecting a new card."
This discussion has been archived. No new comments can be posted.

  • Mid-range? (Score:5, Insightful)

    by XanC ( 644172 ) on Tuesday January 25, 2011 @06:36PM (#35001194)

    Somebody dropping two hundred and fifty big ones on a video card is mid-range?

    • by Anonymous Coward on Tuesday January 25, 2011 @06:47PM (#35001280)

      Somebody dropping two hundred and fifty big ones on a video card is mid-range?

      Yes, $50 for the card and $200 for the monster cable.

    • Yeah WTF. An M5 isn't midrange just because you can buy 83% lean for way less than a dry-aged filet.

    • Re:Mid-range? (Score:5, Insightful)

      by eepok ( 545733 ) on Tuesday January 25, 2011 @06:48PM (#35001290) Homepage

      Precisely my thought.

      Budget: Free/Hand-me-down to $75
      Mid-Range: $76-$150
      Enthusiast: $151-$250
      Takes gaming too seriously: $251+

      • by vux984 ( 928602 )

        It's "mid-range" in Nvidia's line of cards, in that it's not near the bottom and it's not near the top. I think that's fair.

        When you look at 'mid range' from the perspective of the buyer, I think you are more or less right. But the upper "mid range" product is where enthusiasts with brains AND money tend to buy in.

      • I was thinking almost the same thing. "Midrange" to me is $150-ish... and also what I consider the sweet spot for video card purchase. For that price, usually you're getting a good quality implementation of a die shrink of last year's GPU. Lower power consumption, better driver stability than bleeding edge, no redonculous(not a typo) heatsink/fan, and no need to throw your power supply out the window because it has a 6-pin connector instead of 8.
      • And in 6 months, all cards move one level down.

        I just upgraded from on-board video and got a GeForce GT240. Even if my rebate doesn't go through, it falls in your Budget category.
        Then again, I run a quad-core CPU running 64-bit Linux, and am not a hardcore gamer. But it's been great. I'll probably never drop more than $100 on a video card, it just doesn't make sense to me. I think a lot of people get caught up in the "latest and greatest" frenzy, and some people truly are hardcore gamers (those that are

      • If you happen to purchase a 30" monitor with 2560x1600 resolution, you pretty much need a $251+ video card for smooth games. Granted you can get away with a lesser video card, but cheaping out by $200 when your monitor costs $1000+ seems silly.
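The parent's point is mostly about pixel count; a quick back-of-the-envelope sketch (Python, purely illustrative) shows why a 30" panel demands more from a GPU than the "common" 1920x1080 desktop resolution:

```python
# Pixel-count comparison: a 2560x1600 panel asks the GPU to fill roughly
# twice as many pixels per frame as a 1920x1080 desktop display.
res_1080p = 1920 * 1080
res_30inch = 2560 * 1600

print(res_1080p)                # 2073600 pixels
print(res_30inch)               # 4096000 pixels
print(res_30inch / res_1080p)   # ~1.98x the fill rate per frame
```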
    • Re:Mid-range? (Score:5, Insightful)

      by TWX ( 665546 ) on Tuesday January 25, 2011 @06:49PM (#35001300)

      Video cards seem to be the one aspect of computers that doesn't follow both Moore's Law and the cost reduction model that we've seen elsewhere. It would appear that for most computer components and systems, over time power increases and costs drop. In the case of video cards though, prices seem to have been stable or on the increase for the various classes of components at a given point. When my first-generation 3dFX card was top-of-the-line-consumer class it was less than $200 if memory serves. My (at the time) high end Matrox G-series dual head card was about the same price or maybe a little more expensive. Modern ATI and nVidia products seem to be more expensive compared to what the previous cards were introduced at.

      I guess that the cost to game is why I got out of most computer gaming. I found myself with less and less time to play, and it's hard to justify $300 for an expansion card when I'll use it twice a month and when it'll be "obsolete" in six. Ditto for the games themselves, when they're $50 each it's hard to play more than one with such a small amount of time. I get a lot more value for my money buying games at a books/media store that buys the remnants that didn't sell originally a year ago and sells them for $10 a title or less, plus they work on hardware I already have.

      • Maybe you should read up on what Moore's Law actually is.
      • Re:Mid-range? (Score:5, Informative)

        by Fulcrum of Evil ( 560260 ) on Tuesday January 25, 2011 @08:14PM (#35002268)

        Video cards seem to be the one aspect of computers that doesn't follow both Moore's Law and the cost reduction model that we've seen elsewhere.

        How do you mean? Moore's law is all about transistor density - the fact that Nvidia maintains specific price points and varies performance to compete is irrelevant.

        • by TWX ( 665546 )

          In a broader sense, Moore's law has been applied to computing power doubling every eighteen months. Yes, strictly speaking it's transistor density.

        • Video cards seem to be the one aspect of computers that doesn't follow both Moore's Law and the cost reduction model that we've seen elsewhere.

          How do you mean? Moore's law is all about transistor density - the fact that Nvidia maintains specific price points and varies performance to compete is irrelevant.

          Actually, Moore's law focuses more on the economics of chip making. Because chips become cheaper to make over time, manufacturers are able to double the transistor density every 18 months without increasing the cost.

          Moore's law states that the increased transistor density is a side effect of cheaper manufacturing processes, not the other way around.
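The 18-month doubling cadence discussed above is easy to sketch numerically (Python; the one-million-transistor starting point is a made-up figure for illustration only):

```python
# Illustrative Moore's-law arithmetic: transistor count doubling every
# 18 months. The 1,000,000-transistor starting count is hypothetical.
def transistors(years_elapsed, start=1_000_000, doubling_months=18):
    doublings = years_elapsed * 12 / doubling_months
    return start * 2 ** doublings

print(transistors(15))  # 10 doublings in 15 years: 1024x, ~1.02 billion
```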

          • by tlhIngan ( 30335 )

            Actually, Moore's law focuses more on the economics of chip making. Because chips become cheaper to make over time, manufacturers are able to double the transistor density every 18 months without increasing the cost.

            Moore's law states that the increased transistor density is a side effect of cheaper manufacturing processes, not the other way around.

            Correct. But CPUs, GPUs, and chipsets, which are full of random logic, are not the stunning examples of transistor density. In fact, what limits the transistor

      • I get a lot more value for my money buying games at a books/media store that buys the remnants that didn't sell originally a year ago and sells them for $10 a title or less

        In a lot of cases, these games are in the bargain bin precisely because 1. the publisher has pulled the plug on the online multiplayer matchmaking servers, and 2. the game offers no local multiplayer (shared-screen or spawn installation) option.

      • I think it's because video cards are becoming more like whole computer systems in themselves. More and more general purpose computing features and such. Just recently I have been playing with GPU development and I have to say that for certain tasks it's quite impressive.

        Really, $250 (GTX 560) or $350 (GTX 570) is not out of line with what you pay for a mid-range CPU; it makes sense that the video card is in the same ballpark.

        But like everything I do wish they were cheaper.

        • by Machtyn ( 759119 )
          For me, those are the high-end range of things. For processors, I recently picked up a Gigabyte motherboard and an Athlon X4 CPU for $300 total. (An equivalent Intel purchase would have increased the motherboard price by $100-$150 and the CPU price by around $100.)

          Mid-range would be the GT4xx and GeForce 88xx or 98xx series and the low-end would be the GT2xx and GeForce 86xx or 96xx series.

          I would try to list ATI equivalents, but I've never been a fan of them... and now that I'm in a Linux world, I t
    • Re:Mid-range? (Score:4, Interesting)

      by Tumbleweed ( 3706 ) * on Tuesday January 25, 2011 @07:01PM (#35001430)

      Somebody dropping two hundred and fifty big ones on a video card is mid-range?

      I see this reaction a lot in people who don't know the market. Ignorance of where the low and high ends of the 'range' actually sit winds up surprising people. If you're ignorant of the numbers 1 through 10, someone randomly reciting the number 5 might seem high to you. In video cards, there are $350+ cards, and even $500+ cards, in the consumer space. And that's just PER CARD, and doesn't take into account multi-card setups.

      So yeah, $250 is a MID-range card. That's not to say it does (or doesn't) meet your specific needs, but expressing shock at something you're obviously ignorant of really doesn't make you sound like a smart consumer.

      • Re:Mid-range? (Score:5, Informative)

        by obarthelemy ( 160321 ) on Tuesday January 25, 2011 @07:07PM (#35001500)

        Depends how you define mid-range. Steam has a nice breakdown of the actual graphics cards used to play their games: http://store.steampowered.com/hwsurvey/ [steampowered.com] Keep in mind that these stats are for players; the actual market is much more low-end than that.

        So $250 would be about in the top 5% of the gamers' market, 1% of the general market?

        • Depends how you define mid-range

          I define it by the prices involved, since that's what people are talking about in the first place - the price of the card. You'll always find fewer users at the high end of any price range.

          • So it's low end then?

            Given that a Quadro 6000 from nvidia will set you back between $3000 and $4000, a mid-range card must be $1500-$2000, right?

            • So it's low end then?

              Given that a Quadro 6000 from nvidia will set you back between $3000 and $4000, a mid-range card must be $1500-$2000, right?

              That's not a consumer graphics card, though. :)

              • That's not a consumer graphics card, though.

                But you have now gone from saying that mid-range is not defined by the consumer to saying that the top of the range is defined by the consumer.

              • by smash ( 1351 )

                No, you can't just pick a midpoint between zero and max cost. Hardware gets more expensive for diminishing returns at the high end.

                It's quite likely you'll get 90% of the performance of a $2000 card for half the price. Mid-range, performance-wise, is much cheaper.

                Otherwise we'd be calling $800,000 cars "mid-range" as they're halfway between zero and a Veyron's price tag.

        • Re:Mid-range? (Score:5, Informative)

          by wagnerrp ( 1305589 ) on Tuesday January 25, 2011 @09:01PM (#35002568)

          You should look at the chart again. The top two cards of each graphics series are going to be in the $200-and-up range when purchased, so tallying those up from the December survey, you get somewhere around 45% of the users. That's significantly higher than the 5% you seem to have pulled out of nowhere.

          Now what is the general market? The people who are going to buy their own graphics cards are going to be professionals doing 3D or computational work, gamers, and HTPC builders. Everyone else is going to stick to their integrated Intel graphics and be none the wiser. The HTPC market is going to buy all low end stuff, the professional market is going to buy primarily high end stuff, and the gamer market, according to that survey, seems to be right in the middle of that price range. For people who actually would buy a video card, which is the only market that matters to video card manufacturers, $250 indeed does seem to be mid-range.

          • by Machtyn ( 759119 )
            Actually, the HTPC market is starting to get some decent integrated options from ATI and nVidia. These onboard chips are designed for TV connections and HD content (HDMI connectors, software to decode 1080p natively, etc). These chips, as I've found out, still aren't good enough for playing World of Warcraft at great graphics resolutions (good, yes, not great).
      • by eepok ( 545733 )

        Mid-Range typically refers to what the majority are willing to spend on something, not what prices are offered. The buyers, not the sellers, determine mid-range and the buyers aren't scrambling to grab $250 cards.

        • Mid-Range typically refers to what the majority are willing to spend on something, not what prices are offered. The buyers, not the sellers, determine mid-range and the buyers aren't scrambling to grab $250 cards.

          People don't generally buy video cards. They buy computers with (or without) video cards. For people who actually buy video cards, $250 is mid-range.

          • by jedidiah ( 1196 )

            I'm not sure you can assume that just because someone is buying a video card that it will necessarily be the most expensive thing possible.

            Not everyone is trying to play the latest shooter at the highest resolution and frame rate possible. Not even people building or upgrading their own boxes.

            • I'm not sure you can assume that just because someone is buying a video card that it will necessarily be the most expensive thing possible.

              Good thing that's not remotely what I said. We're talking about 'mid-range' here, after all. A $250 card is not a $499 card, and certainly isn't a $599 card. It's much closer to the $150 everyone here seems to WANT to be the mid-range, simply because they're cheap, I guess. But wishing doesn't make it so. I've used $150 cards for the last several generations myself, but

              • An ATI 5870 isn't midrange? What about a 6850? Are they low-end?

                Last-gen doesn't mean "garbage" or even "low-end"; a coworker bought a 7900GTS about 2 years ago and it would still probably rank as "low-midrange" today.

            • by tepples ( 727027 )

              Not everyone is trying to play the latest shooter at the highest resolution and frame rate possible.

              And a lot of these people are happy with Intel onboard "Graphics My Arse".

      • It's mid range for high end gamers. However it is not mid range for average consumers! Do not compare to your peers, or you'll get a misleading number. Someone who drives a Lexus might have an inflated notion of what a mid range automobile is too.

        • "Mid-range automobile"? What possible use would it be to consider that? When there are multiple manufacturers with multiple "ranges" and multiple classes of cars from $2m supercars right down to $18,000 economy cars. If instead you asked a sensible question like: "What is the mid-range Ford saloon-car?" the Lexus owner would look at the most expensive model, then the least expensive, and tell you that the mid-range Ford saloon-car offering was the model closest to the mid-point.

          The new "mid-range" GeForc
    • No. The MSRP is $250. This means they will actually sell for $200, which is midrange (I say $100-$200 is midrange).

    • Re:Mid-range? (Score:4, Insightful)

      by Surt ( 22457 ) on Tuesday January 25, 2011 @08:03PM (#35002144) Homepage Journal

      Yes, it is in the middle of the range between people who spend $0 extra for the on-board graphics of their motherboard/CPU and people who spend $500 for a top-of-the-line card. Mid-range, exactly fitting the definition.

      • by shish ( 588640 )
        By that logic, walking is free, a private jet can be up to $50mil, so a mid-range form of personal transport should be around $25mil
        • Aw c'mon, you're not even trying.

          Flapping your arms is free, a high-end private jet can be about $50M, so a mid-range personal jet should be about $25M.

          According to this list of prices, [aviationexplorer.com] the Gulfstream G550 has a MSRP of about $46M, and a "mid-range" Cessna Citation is in the $15M-$25M range. How about that. Note, this doesn't imply that a Ford Focus should cost $25M. While the Focus is "transportation," it's lacking wings. You been hanging out with BadAnalogyGuy?
          • by shish ( 588640 )

            While the Focus is "transportation," it's lacking wings

            So? I'm not in the market for a plane, I'm in the market for some form of personal transport

    • Also surprised that they say 1920x1080 resolution is "common".

      Basically a year or two ago I got the best card I could get that didn't require an extra fan or an upgraded power supply. It actually did better than my older one, which was a loud space heater. But I was somewhat discouraged to find that it was about the only one of its kind; every other card on the shelf recommended more watts than my tower supplied and came with an integrated fan.

      • by smash ( 1351 )

        1920x1080 is common if you're in the market for new hardware, i.e., if you're building a new box with a new monitor, etc. It's also the lowest res you've been able to buy in an iMac for some time now.

        Given that the monitors we've been getting with our Dells lately have been 1920x1080 and cost about 250 bucks, it isn't going to break the bank.

        1920x1080 is HDTV res, and as more people are doing things like processing high def video on their PCs, it is very likely to become the "Standard" resolution on any

        • by wisty ( 1335733 )

          Also, until Windows gets resolution-independent graphics, 1920x1080 is about as high as you want to get for a 22-inch monitor. Any higher, and I won't be able to see the graphics. Apple's "Retina" figure is 326 dpi to match a normal eye at 12 inches; dpi is a linear measure, so at twice the viewing distance you need only half that, about 163 dpi. Any more, and you are paying good money just to make your icons smaller.

          Besides, my Intel graphics can't render much more.
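The viewing-distance argument above boils down to one line of arithmetic: the dpi needed to match a reference angular resolution falls off inversely with distance. A sketch in Python, taking Apple's 326 dpi at 12 inches as the assumed baseline:

```python
# Required dpi to match a reference angular resolution scales with
# 1/distance: doubling the viewing distance halves the dpi you need.
def required_dpi(distance_in, baseline_dpi=326.0, baseline_distance_in=12.0):
    return baseline_dpi * baseline_distance_in / distance_in

print(required_dpi(12))  # 326.0 dpi at phone distance
print(required_dpi(24))  # 163.0 dpi at a typical desktop distance
```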

    • >>Somebody dropping two hundred and fifty big ones on a video card is mid-range?

      High-end cards hover around $500 and get 33% to 100% more performance than the mid-range cards at $250, which get the same performance edge over the low-end cards around $125, which blow the hell out of the performance of entry-level or integrated graphics.

      The new generation is no exception. The 580 is intriguing to me, but the 560 Ti (especially overclocked) looks like it has the best combination of price, performance, tem

  • by intellitech ( 1912116 ) * on Tuesday January 25, 2011 @06:37PM (#35001198)
    Jeez, I feel old.
    • it's *because* I'm old that I bought the 1920x1080 graphics card (used, $70) and a 23" widescreen monitor (refurbished, $100): big fonts, yet two pages on a screen. I might even get a 2nd monitor, this is sooo nice
    • by TWX ( 665546 ) on Tuesday January 25, 2011 @06:59PM (#35001414)

      Yeah, I hear you. I was used to 1280x1024 or 1600x1200, so these 16:9 or 16:10 aspect ratios take some getting used to.

      What really irks me, though, is a seeming lack of development of inexpensive high-res monitors that go beyond "1080p". My current display is a 20" 4:3 1600x1200 unit, and if I wanted to go bigger I'd want more than 1080 rows. I sort of understand the complaints that audiophiles had back in the eighties with the Red Book CD standard: being limited to 44.1KHz 16-bit audio and no functional implementation of more than stereo. Before that they enjoyed quadraphonic sound in whatever quality the analog recording and playback equipment could achieve, and while lower-end equipment and poor media maintenance might have led to results worse than 44.1KHz 16-bit, high-end gear and good practices would have yielded much better sound. By releasing Compact Disc as the high-end system, and later as the de facto standard for everyone, they cut off the ability to get more.

      • by Ant P. ( 974313 )

        Smartphones push 200, 300, 400dpi already so it's not like 96dpi is even a hard limit. I'd be willing to pay the premium for a 300dpi desktop screen with insanely high resolution, but nobody wants my money apparently...

      • What really irks me, though, is a seeming lack of development for inexpensive high-res monitors that go beyond "1080p". My current display is a 20" 4:3 ratio 1600x1200 unit, and if I wanted to go bigger I'd want more than 1080 rows.

        I had the same dilemma a few years ago and decided to get a 2560x1600 monitor. They cost a bit more, but given the lack of progress, it'll still be high-tech ten years from now. Although some 2160p TVs have been demo'd, I'd call it pretty unlikely you'll be able to get those at a decent price within a decade.

        • Same here. I had a 20" 1600x1200, and when it went bad, I went to a 24" 1920(or some such)x1200. When it went bad, I couldn't find any locally available 16:10 screens, so I went with a 16:9. It annoyed the heck out of me, and I immediately bought a 2560x1600 30" even though it was $1200 or so with warranty. The only problem is you have to have a sufficiently expensive system and video card to drive such a display smoothly.
      • One is, as you say, the de facto standard thing. The top ATSC rez is 1080, so that is what a lot of hardware targets. However, another part is just money. It is expensive to pack more transistors into a small space, and that's what you need for higher-rez monitors. People are pretty price sensitive, so the market would be kinda small, meaning the unit price goes up, meaning the market is even smaller. Another is interconnect bandwidth. Single-link DVI, and by extension older HDMI, only supports up to 1920x1200@60Hz. That's just
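The single-link DVI ceiling mentioned above comes from its 165 MHz pixel-clock cap; blanking intervals push the total pixel count per frame above the visible resolution. A rough check (Python; the 2080x1235 totals are CVT reduced-blanking timings, assumed here):

```python
# Single-link DVI caps the pixel clock at 165 MHz. With CVT reduced-blanking
# timings (assumed: 2080x1235 total for a 1920x1200 visible frame),
# 1920x1200@60Hz just squeezes under that cap.
total_pixels_per_frame = 2080 * 1235   # visible 1920x1200 plus blanking
pixel_clock_mhz = total_pixels_per_frame * 60 / 1e6

print(pixel_clock_mhz)  # ~154.1 MHz, under the 165 MHz single-link limit
```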

        • by adolf ( 21054 )

          So, let's be in the future, already.

          I've had a 15.4" 1920x1200 display on my Dell Inspiron laptop for six and a half years. Scaling problems? I haven't seen any in a long, long time -- even XP was behaving pretty well in that regard when I last used it.

          7, as you say, is flawless and I've had precisely zero issues with that end of things: It even tends to set things up with reasonable scaling, based on actual display DPI automatically, out-of-the-box, while also automagically configuring things at native

      • I'm a big fan of vertical pixels too. One of the things I did with my old Dell 2407 was turn it 90 degrees. The rotated 1920x1200 screen is perfect for web browsing, gmail, and other 'tall' layout apps.

        The 'cheap' panels are 16:9 form factor - you see the 1080p stuff everywhere because it costs nothing. Think I paid around $130 for a 22" 1080p monitor that *just* fits inside a carry on suitcase. Those can be rotated as well. (Tis a crime you can hardly find a laptop not using a 16:9 aspect - I really li

      • Not to mention the lack of decent, affordable IPS LCD panels. Thanks to their narrow viewing angles, TN panels have a significant stereographic effect that is so horrible, they give me eyestrain worse than any tube I've ever used. Even the crappiest LCD TV is better than a high-end computer monitor.

        I find it distressing how many people will spend $300 for a new video card every two years, but then they spend several years using some $150 LCD they bought on sale. My dad, for example.

      • Err, not really sure what would have been better audio, as CDs used the best-quality recordings available at the time, which was DAT, and that is where the limits come from. I'm not entirely sure that there were any better analogue recordings either, as they all have other issues, such as noise, which can be a very big problem.

        Also, I'm not convinced you can actually hear anything above 22.05 kHz, so 44.1kHz is a reasonable compromise, especially as this was at the very limit of what was possible 30 years ago.

      • Additionally, HD monitors have killed-off the 4:3 aspect-ratio displays. I manufacture OEM equipment, and there are sooooo many situations where the 4:3 monitor was superior. I can't buy new ones now to save my life.
    • Assuming you bargain-hunt, they aren't that expensive. But they do require some outlay of cash; I think mine was under $300 and serves as an HDTV as well. It wasn't that long ago that 15" LCDs were going for $300+.
  • by bhcompy ( 1877290 ) on Tuesday January 25, 2011 @07:18PM (#35001640)
    5770 is a nice midrange card. Plays everything well, mostly with high settings. $140 is a nice price for a 5770 w/ 1GB GDDR5. For nearly the same price as the card mentioned here, you're in CrossFireX with more power behind it.
    • Uh sure, or you could get a $125 nVidia GTX 460 and completely destroy that 5770 in terms of performance and features. Plus you get a lot better drivers.

      • I went from an nVidia card to the 5770 purely because I didn't want to heat up my computer room just by having the computer on. (I also went for a low powered CPU for the same reason).

        I live in Australia, and it gets rather hot around here. I found that it doesn't matter how much better the performance of one card is if it makes the room so oppressively hot that I just don't want to use the computer in the first place. With my current setup, I can use the PC on the hottest day and still have the room only

      • Completely destroy, as in marginally better. And a GTX 460 1GB averages closer to $200 than $125. Cheapest on Newegg is $170 on special; the average price across the wide range of manufacturers is $199 (sorry, I'll stay away from PowerColor and HIS, thank you).

        And driver problems? This isn't 2002.
        • by smash ( 1351 )
          No, it's 2011 and ATi's drivers are still shit, especially under Linux. You'd think they'd have fixed them in the past 9 years, but alas...
          • and you're not playing games under Linux that are going to max out either card, so why spend more for something that's going to suck up more energy and give you potentially marginal gains?
  • by CAIMLAS ( 41445 ) on Tuesday January 25, 2011 @07:23PM (#35001686)

    It would appear, based on power use and the performance of various chips, that the CPU's days of being the power hog and performance workhorse of the common desktop are over. Anything that needs a high-end CPU today can (or at least should) be able to utilize the GPU on the card as well - and to greater effect.

    At the same time, we're seeing power-use increases in our GPUs today similar to what we saw 8-10 years ago with CPUs. Performance is increasing, but power input is as well. 40 dB for a graphics card is quite a bit, as is 230+ watts (ohmygod, that's more than my entire system while playing a game).

    I wonder how long it'll be until we see the same kind of power performance improvements in GPU design as we saw in CPU design a couple years ago.

    All said, it's quite a contrast from the 700MHz Celeron I still have cooking away with a 'whole system' power envelope of about 25 watts (the PSU is only 35 watts), and have for the past 8 years. No, it's not gaming, but it's doing quite a lot just the same.

    • Re: (Score:2, Interesting)

      by eepok ( 545733 )

      About a year and a half ago, I upgraded my system to a cheapo off a w00t!-off for ~$300. It came with a decent dual-core processor, DVD-RW, 750GB HDD, onboard sound, onboard video, 6GB RAM, and a free upgrade to Win7 from the pre-installed Vista. It also came with a 270W power supply. Being a budget gamer and someone always open to another computer challenge, I looked immediately into making a low-wattage system that could play games like L4D2 and the aging but still-insanely-resource-hungry Everquest.

      After

    • by Elbereth ( 58257 )

      How the fuck did you get that system to even power up with only 35W? RAM itself can use up most of that, unless you're using ancient PC100 RAM, and only 64MB of it.

      Holy crap.

      • Must be coppermine.

      • by CAIMLAS ( 41445 )

        Not terribly sure, to be honest with you. The PSU only has a 35 watt rating, so I'm not bending things too much; it's been a while since I tested it at the wall, but I find it hard to believe it'd be much more than that.

        It's one of these: http://www.accurateit.com/images/items/compaq_ipaq.jpg

        It's got the original 700MHz CPU (the board wouldn't boot with a 900MHz replacement), a single fan in the PSU (quiet), an 80GB disk, and 386MB of RAM. I'll be very sad when it finally kicks off to the great divide - I no lon

  • Additional benchmarks in another review over at HotHardware: http://hothardware.com/Reviews/NVIDIA-GeForce-GTX-560-Ti-Debut-MSI/ [hothardware.com]

  • >You may want to pay closer attention to power consumption and noise levels when selecting a new card."

    Hells no!! If the card doesn't make the room lights dim when I start up Crysis, and the back of the computer doesn't feel like a blowdryer (and sound like one), it's not fast enough!!!!111oneoneoneone *pant pant pant*

  • The GTX 470 is slower-clocked, but has 448 cores and a 320-bit memory interface. Does the GTX 560 Ti's higher clock speed make up for fewer cores and a narrower memory interface? I'm interested in experimenting with OpenCL and getting three of these in SLI for some GPU raytrace rendering - something the 470 lends itself to pretty well. The 560 seems like a few steps forward and a few steps backwards - hard to say if it's worth getting over the 470, unless I have grossly missed something.

  • I got sick and tired of spending nearly double the price to get games on a console so I decided to get a mid range gaming PC. Looking over the math again, $1500 probably could have bought a lot of console games instead.
    • by cfalcon ( 779563 )

      If you got time to play that many console games, more power to you.

    • I'm not sure exactly what you bought, but in the last month I bought an Asus P8P67 mobo + Intel 2500K 4 core processor for $300 (runs at 4.5GHz on demand using Asus' built in overclocking), 8GB ram for $80, CPU Heatsink $30, computer case for $100, and a Geforce 580 for $500. I saved some money by using the OS and HD from my old system. But I would call this more than mid range, and less than $1500 (or at least comparable if buying Win7 and a HD)
