AMD Hardware

AMD Launches Budget Processor Refresh

MojoKid writes "AMD has again launched a bevy of new processors targeted squarely at budget-conscious consumers. Though Intel may be leading the market handily in the high-performance arena, AMD still provides a competitive offering from a price/performance perspective for the mainstream. HotHardware has a performance quick-take of the new 3.2GHz Phenom II X2 555 and 2.9GHz Athlon II X4 635. For $100 or less, bang for the buck with AMD is still relatively high."
  • I agree... (Score:5, Informative)

    by Yaa 101 ( 664725 ) on Tuesday January 26, 2010 @07:23PM (#30912062) Journal

    I agree, I have a Phenom X2 and my whole system cost me a mere €300, including sound, an HDD, and good enough video to run a 3D GNOME desktop.

    • The standard white-goods market, where everything is cheap and disposable.

      The standard pleb doesn't really give a damn whether it can crunch a billion petaflops in under a nanosecond, or heat a cup of water standing on the desk by its sheer awesomeness.

      All they care about is whether they can chat to their friends, write a letter, browse the intert00bs and lose the last bit of their privacy by posting everything on Facebook.
      • by nyctopterus ( 717502 ) on Wednesday January 27, 2010 @04:20AM (#30914794) Homepage

        Wow, tell us more about the stupid sheeple, oh insightful one!

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        Actually, the new AMDs are also very valid options if you are interested in bang for the watt. A 65 W TDP for a pretty fast quad core is awesome if you're doing number crunching.

      • Re: (Score:3, Insightful)

        I can't say anything about your comments on the use patterns of the "standard pleb", but I do quite a lot of structural engineering work, which involves using commercial structural analysis software extensively on a daily basis and even developing my own programs, and I do all of that on a "cheap and disposable" Athlon X2 4000+ system which cost me around 250 euros three years ago.

        The thing is, you may have far more powerful CPUs on the market but the truth is that, although they can cost huge amounts of money,

      • FLOPS is a measure of rate, so describing something as performing a certain number of FLOPS in under a second implies that the processor is accelerating.
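        (To make the units point explicit - a small worked sketch, treating the parent's "billion petaflops in under a nanosecond" figure purely as a hypothetical: FLOPS already has "per second" built in, so dividing it by a time gives operations per second squared, i.e. an acceleration.)

```latex
\[
\text{FLOPS} = \frac{\text{floating-point operations}}{\text{second}},
\qquad
\frac{10^{9}\ \text{petaFLOPS}}{1\ \text{ns}}
  = \frac{10^{24}\ \text{flop/s}}{10^{-9}\ \text{s}}
  = 10^{33}\ \text{flop/s}^{2}.
\]
```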
    • It's fast enough for any usual non-gaming usage...and also for most games, if you're fine with mostly ignoring latest-gen ones (and really, with so many great older ones that's easy). Plus it is consistently passively cooled.

      • by jandrese ( 485 )
        Hell, most of the "latest gen" games are just console ports anyway, and run quite well on PCs that are 5 years old. Sometimes the console port is of especially poor quality and requires a beefy CPU (Grand Theft Auto, for example), but even old 2.4GHz Core 2 Duos are well above what you need for most modern games.
        • Re: (Score:3, Insightful)

          by TikiTDO ( 759782 )

          Also, if the majority of the public has these slower CPUs, what sane game maker is going to make games that do not at least run on these machines? That sounds like a good way to lose 90% of your profit.

          • Re: (Score:2, Interesting)

            by nomessages ( 1160509 )
            Futuremark, at least...
          • by Sycraft-fu ( 314770 ) on Tuesday January 26, 2010 @11:25PM (#30913538)

            Games almost never require high-end systems. There are a few that come along that won't run on anything less than the latest and greatest, but that is extremely rare. Most games will run on mid-rangeish hardware and don't have a problem with parts a couple of generations out of date. They won't let you max out all the detail in that case, but they'll run just fine.

            Most people do not have high-end systems. Many systems are older; after all, not everyone upgrades all the time, and even when they do they often don't buy the high-end parts. As such, game makers support that. They usually also have higher detail settings for people with higher-end systems, since those people often also spend more money, but they don't usually cut out the more mid-range market.

            Right now most games run quite well on a dual core in the 2GHz+ range with a $100ish current graphics card or a $200ish older graphics card. By well I mean with details turned up a reasonable amount and smooth gameplay.

            • Details on medium-low, and at lower resolutions (1280x1024 or 1440x900 seems to be the limit, 800x600 is better, and 1680x1050 is usually not very good. Higher resolutions need not apply). Usually "smooth gameplay" means decent minimum frame rates.

              • I get a good frame rate at 16x10 (>40fps) and a decent frame rate at 19x12 (~30fps) with a 5-year-old Athlon 64 X2 4200+ (2x2.2GHz) and a Radeon 4850. Recent games I have played were Dragon Age and Torchlight, and all settings were at least on medium, most on high.

                IMO, the level of detail in most games already exceeds by a huge margin whatever a normal human being can perceive. Pushing for more detail is pointless.

                • "High details" are usually shown in super-zoomed images in games reviews.
                        30 fps is absolutely enough in minimum frame rate (depends on game, though - less might be enough in some titles).

            • Lol. Depends on what you call “run”.

              If you call 18 fps with medium graphics settings and medium resolution "running", then yes.

              But if you want your game to not look like the previous generation, or even worse, your PC has to be new too.

              • by siDDis ( 961791 )

                I play Call of Duty: Modern Warfare 2 just fine with a GeForce 8800 GTS 320MB.

                That is at high quality settings and 1920x1200 resolution.

                Framerate? Don't know; it's smooth enough, which should be around 30-40 frames per second on average.

                Spending more than $100 on a graphics card today is just a waste of money, except if you would like to play a game like Crysis on Linux, where you need the extra power to get decent performance. Then again,

        • by sznupi ( 719324 ) on Tuesday January 26, 2010 @08:59PM (#30912782) Homepage

          I would also be glad to see the term "console port" go away. It's nonsensical; it implies there was some amount of "porting" being done...while that's not really true nowadays, not after MS's efforts. Same dev tools, same team, same engine, similar art assets; there's no porting taking place, only two parallel and largely common efforts, not exploiting the strengths of both platforms (do you think the console side of such a game is really optimised for the hardware?).

          But the term must be convenient for publishers, with players pointing fingers at those "evil consoles" instead of pointing them at...publishers.

          • by jandrese ( 485 ) <kensama@vt.edu> on Tuesday January 26, 2010 @11:44PM (#30913658) Homepage Journal
            This sort of thinking creates PC games that tell the player to hit the "X" button to do something, only it actually means the left mouse button because it was X on the console. It also results in FPS games with horrible console-like auto-aim on devices using a mouse and keyboard, and games that needlessly reuse keys because the original controller was too limited for all of the functions the devs wanted to do.

            Console ports require more thought than "recompile with a different target".
            • And no main menu option for "quit". And an uncontrollable third-person camera. And a game mechanic only workable on an analogue movement controller. And so on, and so on.

          • Um....no.

            First, as other posters have pointed out, there are other consoles besides the Xbox.

            Second, we do have different processor architectures between the Xbox 360 and the PC. Now, if your coders are competent that won't be a problem, but at a game company you'll likely find some relatively young guy who's absolutely sure that his assembly is faster than what the C compiler can produce, so he codes something in ASM. Or he assumes pointers are 4 bytes. Or that numbers are big-endian*. Or any one of the
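            (A minimal, hypothetical C sketch of the endianness pitfall mentioned above - not taken from any actual game codebase: reinterpreting a byte buffer as a 32-bit integer gives different answers on the Xbox 360's big-endian PowerPC and on a little-endian x86 PC, while assembling the value byte by byte is portable.)

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    /* Four bytes as they might arrive from a file or the network,
       intended to encode the big-endian value 0x00000001. */
    uint8_t buf[4] = {0x00, 0x00, 0x00, 0x01};

    /* Non-portable: reinterpreting the bytes as a uint32_t gives 1 on a
       big-endian CPU (e.g. the Xbox 360's PowerPC) but 16777216 on a
       little-endian x86 PC. */
    uint32_t naive;
    memcpy(&naive, buf, sizeof naive);

    /* Portable: build the value explicitly from bytes, so the result is 1
       regardless of the host's endianness. */
    uint32_t portable = ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16) |
                        ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];

    printf("naive=%u portable=%u\n", naive, portable);
    return 0;
}
```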

            • by sznupi ( 719324 )

              But while saying this you should keep in mind, say, the PC version of Quake and its port to the PS1...

              The Wii is largely out of scope here; it's very different, with a large portion of exclusive games (which also require vastly different coding practices), and those games aren't the ones people think about when saying "console port". The PS3 seems to invalidate my point...but, when you look closer, multiplatform titles present also on PS3 are sometimes built on a middleware engine, and often simply end up not very good on eit

    • With that price, I wonder if AMD can compete on the performance-per-price scale.

      E.g. I bet you get a ton of AMD CPUs for the price of one high-end Core i7.

      What we need is 4-8 socket mainboards!

  • I would like to get one of these discount quad-core AMD processors, but I don't know which is the best option for a Hackintosh. Any recommendations? My current hackintosh is Intel-based and I don't know how tricky it might be to configure a working AMD-based hackintosh. Links appreciated.

    Seth
    • Yes, something with an Intel processor. Save yourself the headaches; it's not worth the savings.

    • Despite the above comment being made by an Intel troll, you will need an Intel system to run OS X. Most any of the cheap Intel chipsets (like the G31) will work with OS X.
      • by jedidiah ( 1196 )

        The Hackintosh has to look as much like a real Mac as you can manage.

        Of course that means Intel only for the CPU.

        An alternate ATI or Nvidia GPU (beyond what's in actual shipping Macs) might even be a problem.

    • Re: (Score:2, Informative)

      by Anonymous Coward
      Honestly, there is a very big learning curve you're going to need to overcome in order to get a reliable hackintosh build on an AMD platform. Your best bet is an Intel (Socket 775) based platform. Various motherboards there will have issues (usually audio), not to mention video cards. AMD chipsets tend to have more issues still. If you want to go with a Mac, I'd just suggest starting with a Mini; if that doesn't suit your needs, bump up to an Intel-based hackintosh.

      Don't get me wrong, i
      • by sznupi ( 719324 )

        So, would there really be any difference if an Nvidia board similar to the Intel 9300 ones were the variant...for AMD?

    • Please visit the OSx86project wiki page to have those questions answered. I'll tell you off the bat that you will have to patch the kernel, which already puts you at risk for a world of hurt if your other components don't play nicely either.

  • by judolphin ( 1158895 ) on Tuesday January 26, 2010 @08:14PM (#30912488)
    For example, AMD outperforms Intel pound-for-pound in graphics and video rendering (which would make sense since they acquired ATI). If you're building a media center, get a computer with an AMD processor. It's one of the few things in life where cheaper is better.
  • I've been out of the PC-building loop for years since I moved to a Mac, and while I still plan to keep a Mac laptop, I am studying scientific computing in grad school starting in April and was looking to buy a beefy Linux machine. These CPUs look really interesting, but I was hoping not to have to buy a tower to house it in. Do these things run all right in a mini-case? Any suggestions on a good small case to house one of these in?
    • The case size doesn't matter so long as you can pull enough air through it. I look after a few 1U-sized servers, each with two twin quad-core machines fed by the same power supply between the motherboards. You could almost use the things as a hair dryer.
    • by WuphonsReach ( 684551 ) on Tuesday January 26, 2010 @09:00PM (#30912790)
      Go with a microATX motherboard (preferably one that only uses heat pipes and no moving fans, like the Asus boards). Use one of the 45W TDP AMD chips, which are dead easy to cool even in confined spaces, and the stock fan runs pretty much silently.

      As for the case... I don't have a suggestion for that at the moment.

      (The best place to pick up the AMD CPU & MB is over at MWave, since they'll bundle it, assemble the CPU and RAM onto the MB, and test it for you. So you're never left holding a bag full of incompatible parts.)
    • by JanneM ( 7445 ) on Tuesday January 26, 2010 @09:31PM (#30912946) Homepage

      Like you, I need a Linux machine for work-related compute-intensive work, so I assembled one last fall. I use a decent quality MicroATX case with the Gigabyte MA785GPMT motherboard and the Phenom II X4 955. Add 8GB of memory and a drive and you're set. I was going to add a separate graphics card at one point, but so far I actually use the on-board graphics with the 2D-only free drivers. I don't need speedy graphics for showing terminal output and static graphs, after all.

      The system came in cheap, it's really quiet and it's surprisingly speedy. True, it's barely half the speed of the 8-core Xeon machine I have at work - but at only an eighth of the cost.

      My only advice is, don't go too cheap on the case. That's the single most important part for determining the noise level, and there's nothing so irritating as having a constant high-pitched whine from under your desk all day long.

      • Re: (Score:3, Insightful)

        by apoc.famine ( 621563 )

        You're modded up, so I'll just add a +1 comment to your observation on the case. For the last decade or so, the most expensive part of my systems has usually been the case. Of course, there were only about 3 of them in that decade. A good case is a must.
         
        So far, I've been a loyal Antec fan. Roomy, rolled edges, rails for everything, good ventilation...I have no complaints about their cases. They are damn well built.

    • My HTPC has an Athlon II X4 620 running pretty well in a small Antec HTPC case [newegg.com] on a 785G mATX motherboard with a 350W PSU - it probably wouldn't handle a discrete graphics card (the integrated ATI GPU handles 1080p H.264 playback fine), but one wouldn't fit anyway. Also, my work PC is an Intel Core i7 920 housed inside a Shuttle. Quad-cores are now being squeezed into laptops too. Needless to say, there's hardly any need for a large tower with loud cooling, unless you need the space and/or want to overclock.
      • A half-height discrete card will fit in that case just fine, and you'd see a nice graphics boost if you like your HTPC to do double duty.

    • The lower-power Phenom II chips have the lowest TDP around; functional units and whole cores can be powered down when not in use. You're not likely to see the latter while using your machine, except that the Phenom II X3 processors make use of it so that the disabled core doesn't even cost you any power.

  • Intel v AMD (Score:5, Informative)

    by m.dillon ( 147925 ) on Tuesday January 26, 2010 @09:56PM (#30913084) Homepage

    I build new boxes every 6-8 months or so and rotate them into production boxes to make room for the next set. Until recently the Intel chipsets were ahead of the game vs the AMD chipsets with regard to things like eSATA, AHCI, and PCIe. AMD has caught up in the last 8 months, though. High-end Intel CPUs tend to be a bit faster than high-end AMD CPUs, and you can also stuff more memory into small form-factor Intel boxes vs small form-factor AMD boxes.

    On the flip side, AMD boxes tend to be cheaper all-around and aren't quite so gimmicky when it comes to managing CPU speed vs heat dissipation. Whole systems based on AMD also seem to eat less power, which matters from a cost standpoint when running systems 24x7. Power is getting to be quite important.

    If you are trying to create the fastest, highest-performance box in the world Intel is probably your game (and for graphics you are going to be buying a 16x PCI-e card anyway with that sort of setup).

    If you ratchet down your expectations just a bit, however, you can slap together a very good box with AMD at its core for half the price and 85% of the performance, and that is going to be plenty good enough for just about anything considering how overpowered machines have gotten in the last few years vs what people actually run on them.

    Personally speaking, I see no point purchasing the absolute bleeding edge when it is just going to become non-bleeding edge in 8 months, when I can instead purchase two of something just slightly behind the bleeding edge at a much lower price.

    These are just my observations.

    -Matt

    • Is that AMD chipsets have been buggy in my experience. Well, for the most part it seems like there haven't been actual chipsets made by AMD; they've always been third party, like nVidia, VIA or ATI. At any rate, they seem to have bugs, sometimes minor, sometimes severe. The worst was back with the original Athlons: I got one and could not make it work with my GeForce 256. I found out this was because the AGP bus was out of spec and didn't work with the GFs at all.

      That is one of the main reasons I've stuck with Int

      • by sznupi ( 719324 )

        Well, if you want to go that far back, to Athlon and GF256 days, Intel had their share of problems too... (unstable P3 Coppermines just above the 1GHz mark? Flaky motherboards with the Rambus chipset & bridge?)

        That said, yes, many chipsets for AMD had problems - but you could always find something solid. The SiS chipsets, which you don't seem to even remember, were particularly impressive - perhaps slightly slower than the VIA, Nv or ATI alternatives, but absolutely rock-solid and trouble-free, on par with Intel (for example,

  • by Nom du Keyboard ( 633989 ) on Tuesday January 26, 2010 @10:33PM (#30913290)
    There are fewer "garbage processors" from AMD - fewer intentionally crippled varieties that are missing little bits of this and little bits of that compared to what Intel offers. With Intel I always have to read the fine print on every processor to see if it supports virtualization extensions, to give only one example.
  • by Tamran ( 1424955 )

    It's a pretty decent/entertaining review. He also talks about overclocking.

    http://www.youtube.com/watch?v=CNcE3GND3sQ&feature=sub [youtube.com]

  • by cenc ( 1310167 ) on Tuesday January 26, 2010 @10:42PM (#30913350) Homepage

    You know, 1 core, 2 cores, 3 cores, 1,000,000 cores - I have realized it means exactly jack if the data they need to crunch is still sitting on a friggin' hard drive.

    My processors and I would do flips and flops if we could just get some damn data off our drives. Come on - we have basically not had a real leap in hard drive speeds or technology in how many years?

    I mean, solid state and all are great, but they still have a long way to go. What happens when we need to start pushing terabytes like megabytes?

    We've got a RAM and cache arms race going on because the hard drives suck and no one seems to be doing anything about it.

    The best we can do are RAID tricks to get any more performance (or reliability, for that matter), and that has well-known limits and problems.

    • by jedidiah ( 1196 )

      ...When video processing is no longer a highly CPU-bound activity, you will have a point there.

      Until then... not so much.

      You may be hard-pressed to stress the network with a number of clustered multi-core machines. Never mind overwhelming disks or HBAs.

    • Uhhh... are you aware of SSDs? Admittedly they are only an order of magnitude better in some respects right now, but they are still relatively in their infancy. With the new chips being made on silicon, SSD tech should now curve very close to CPU tech...
    • You know that there is this technology called RAM, right?

      Also, there is cache. And nowadays, optimization consists of fitting your algorithm in the cache. Then when it's done, you take the next block (see the sketch below). So what back in the day was swapping (to disk) is nowadays swapping (to RAM).

      I recommend you buy yourself as much RAM as you can fit into your mainboard. That should help.
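      (A rough C sketch of the cache-blocking idea described above - a hedged illustration, not tied to any particular application: work on tiles small enough to stay resident in cache, finish a tile completely, then move on to the next block.)

```c
#include <stddef.h>

/* Naive transpose: for large n the column-wise writes miss cache
   constantly, so the working set is effectively streamed from RAM. */
void transpose_naive(size_t n, const double *src, double *dst) {
    for (size_t i = 0; i < n; i++)
        for (size_t j = 0; j < n; j++)
            dst[j * n + i] = src[i * n + j];
}

/* Blocked (tiled) transpose: work on B x B tiles that fit in cache,
   finish one tile completely, then move to the next block. */
void transpose_blocked(size_t n, const double *src, double *dst) {
    const size_t B = 64; /* tile size; tune so 2*B*B doubles fit in cache */
    for (size_t ii = 0; ii < n; ii += B)
        for (size_t jj = 0; jj < n; jj += B)
            for (size_t i = ii; i < ii + B && i < n; i++)
                for (size_t j = jj; j < jj + B && j < n; j++)
                    dst[j * n + i] = src[i * n + j];
}
```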

  • AMD (Score:5, Insightful)

    by yoshi_mon ( 172895 ) on Wednesday January 27, 2010 @12:02AM (#30913760)

    If nothing else, AMD serves as a counterweight that keeps Intel from becoming a monopoly. Furthermore, they actually make some pretty good chips.

    I support AMD because they keep Intel in check. And as a bonus, their chips aren't that bad.

  • Their benchmarks seem decent. [cpubenchmark.net] The Athlon II X4 620 is a solid performer.

    And the Athlon II X4 630 2.8GHz 4-core processor is getting great reviews at Newegg [newegg.com], with good potential for overclocking, even with the stock cooler.

    There are a few great motherboard/CPU combo deals going on right now at Newegg: quad-core for $170 [newegg.com] and dual-core for $90. [newegg.com]
  • by keeboo ( 724305 ) on Wednesday January 27, 2010 @01:55AM (#30914288)
    Again this silly fight between AMD and Intel.
    When are people going to learn that performance _depends_ on what you're going to process?

    I remember, a few years ago, a server we had with an Athlon XP 2600 (its real clock was 2.1GHz AFAIR). A perfectly speedy machine for desktop usage, but as a server (pure CPU load in that case, no I/O bottleneck) it was having a real hard time. We eventually replaced that machine with an old 4x Xeon (P3-based, 500MHz), and things went back to normal.
    I already suspected what the problem could be, so I decided to run a test, temporarily replacing the Xeon-based server with a Sun Ultra 30 (1x UltraSPARC II @ 300MHz).
    Well, the SPARC not only survived the test, but also kicked the Athlon's ass hard. Still, as a desktop machine, the SPARC was mediocre.
    The difference was that the SPARC had 2MB of L2 cache, while the Athlon had only 256kB (even with 2x the bandwidth and lower-latency RAM). In _that_ case the L2 cache made all the difference. Per MHz, the SPARC also beat the Xeon machine (1MB L2 per processor) by a large margin.

    The (pre-64) Athlon's performance compared to the P4 (sorry, I don't have an i7 to compare against an X4) varies. For desktop usage the Athlons felt snappier in general, but with some performance "hiccups" when you started to tax the machine more. The P4s felt slower overall, but the performance seemed to be more homogeneous.
    Which one was better, then? Well, that's a good question. I personally preferred the "slower but smoother" P4, but Athlons were fine and I could recommend both processors for home usage,

    You know what really, really sucks?
    Those benchmarks they publish around.

    I mean, "XYZ fps in Crysis"? MP3 LAME encoding time? Synthetic benchmarks?
    Those say nothing to me. Run some database benchmark, or measure the time it takes to compile the Linux kernel using all cores at once... Or move GBs of data in memory N times, etc. Then it might be interesting (a rough sketch of such a memory test follows below).
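    (A rough, hypothetical C sketch of the "move GBs of data in memory N times" kind of test suggested above - the buffer size and pass count are arbitrary assumptions, and older glibc toolchains may need -lrt for clock_gettime.)

```c
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

/* Rough memory-bandwidth measurement: copy a large buffer several times
   and report GB/s. The buffers are sized well beyond any CPU cache so the
   traffic actually hits RAM. */
int main(void) {
    const size_t size = 256UL * 1024 * 1024;   /* 256 MB per buffer */
    const int passes = 8;

    char *src = malloc(size);
    char *dst = malloc(size);
    if (!src || !dst) return 1;
    memset(src, 1, size);                      /* touch pages up front */
    memset(dst, 0, size);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < passes; i++)
        memcpy(dst, src, size);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double gb = (double)size * passes * 2 / 1e9; /* read + write traffic */
    printf("~%.2f GB/s effective memory bandwidth\n", gb / secs);

    free(src);
    free(dst);
    return 0;
}
```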
    • by jregel ( 39009 )

      Wish I had some mod points today...

      Someone please mod the parent up as insightful!

  • For everything else, AMD's price/performance ratio can't be beat, Intel's superior marketing notwithstanding. It would cost me twice as much money to get an Intel processor and a decent Intel chipset mobo for the desktop I'm running right now. Quite frankly, I think this price differential is much better spent on a 128GB solid state drive.
