AMD Graphics Hardware

AMD Ryzen 5000G Series Launches With Integrated Graphics At Value Price Points (hothardware.com)

MojoKid writes: AMD is taking the wraps off its latest integrated processors, the Ryzen 7 5700G and the Ryzen 5 5600G. As their branding suggests, these new products are based on the same excellent AMD Zen 3 core architecture, but with integrated graphics capabilities on board as well, hence the "G" designation. AMD is targeting more mainstream applications with these chips. The Ryzen 7 5700G is an 8-core/16-thread CPU with 4MB of L2 cache and 16MB of L3. Those CPU cores are mated to an 8 CU (Compute Unit) Radeon Vega graphics engine, and it has 24 lanes of PCIe Gen 3 connectivity. The 5700G's base CPU clock is 3.8GHz, with a maximum boost clock of 4.6GHz. The on-chip GPU can boost up to 2GHz, which is a massive uptick from the 1.4GHz of previous-gen 3000-series APUs.

The Ryzen 5 5600G takes things down a notch with 6 CPU cores (12 threads) and a smaller 3MB L2 cache, while the L3 cache size remains unchanged. The 5600G's iGPU is scaled down slightly as well, with only 7 CUs. At 3.9GHz, the 5600G's base CPU clock is 100MHz higher than the 5700G's, but its max boost lands at 4.4GHz, with a slightly lower GPU boost clock of 1.9GHz. In the benchmarks, the Ryzen 5 5600G and Ryzen 7 5700G both offer enough multi-threaded muscle for the vast majority of users, often besting similar Intel 11th Gen Core series chips, with highly competitive single-thread performance as well.

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • That's marketing-speak if I ever heard it. Translated, it means "it's not cheap, but we thought about making it even more expensive."

    • by ravenshrike ( 808508 ) on Tuesday August 03, 2021 @08:36PM (#61653437)

      They're selling at a price point at which it makes sense to buy one unless you're going to shell out for a decent GPU, which in the current market adds another 250 bucks minimum.

      • by AmiMoJo ( 196126 )

        Current AMD CPUs with on-board GPUs are ridiculously good value. The GPUs are powerful enough for many games, particularly eSports, so you can use them until discrete GPU prices calm down a bit. They don't cost much more than the non-GPU variants either.

        They are good for servers as well because you can use the GPU for compute stuff and for a basic display. Transcoding is a common example.

  • by Ostracus ( 1354233 ) on Tuesday August 03, 2021 @07:31PM (#61653141) Journal

    Good backbone for a modern NAS.

    • Comment removed based on user account deletion
    • Don't these "G" chips that have graphics lack ECC support? If you're using ZFS, it's highly recommended to go ECC.

      • There's nothing magical about ECC specific to ZFS. The only reason to use ECC is to close a final possible gap where corruption can slip in. Here's a hint: if you're not currently using a checksumming filesystem, then moving to ZFS even without ECC RAM is already a huge improvement.
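
        To make that concrete, here's a toy Python sketch (purely illustrative; nothing like how ZFS is actually implemented) of what a checksumming filesystem buys you: every block is stored alongside a hash, and reads verify the hash, so on-disk corruption becomes a loud error instead of silently bad data.

            import hashlib

            def write_block(data: bytes) -> dict:
                # Store the block together with its SHA-256 checksum.
                return {"data": bytearray(data), "checksum": hashlib.sha256(data).digest()}

            def read_block(block: dict) -> bytes:
                # On read, recompute the checksum; a mismatch means the block rotted on disk.
                if hashlib.sha256(bytes(block["data"])).digest() != block["checksum"]:
                    raise IOError("checksum mismatch: block is corrupt")
                return bytes(block["data"])

            blk = write_block(b"important data")
            blk["data"][0] ^= 0x01       # simulate a single flipped bit on disk
            try:
                read_block(blk)
            except IOError as e:
                print(e)                 # error raised instead of returning bad data

        ECC protects the other path: a bit that flips in RAM before the checksum is computed gets checksummed as-is, which is the final gap referred to above.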

  • by blahbooboo ( 839709 ) on Tuesday August 03, 2021 @07:31PM (#61653143)
    It's really frustrating, the CPU names used by Intel and AMD. You used to be able to tell performance quickly from the name and maybe a model number. Now when I am comparing CPUs I have to look up and review online charts to see which is what generation, speed, or number of cores/threads.
    • by antdude ( 79039 )

      It's not just CPUs, either. :(

    • by AmiMoJo ( 196126 )

      What's difficult about the AMD naming?

      The generation is denoted by the thousands, so 5000 series is newer than 4000 series. The hundreds are the tier within that generation, so 5700 is better than 5600. Finally they stick a G on the end if it has integrated graphics.
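
      That scheme is mechanical enough to sketch in code. Here is a toy Python decoder of the numbering as described above (illustrative only; the reply below points out where the real lineup deviates from it):

          import re

          def decode_ryzen(model: str) -> dict:
              # Parse a model string like "Ryzen 7 5700G" into the parts
              # the naming scheme nominally encodes.
              m = re.search(r"(\d)(\d)(\d{2})([A-Za-z]*)$", model.strip())
              if not m:
                  raise ValueError(f"unrecognized model string: {model!r}")
              series, tier, _, suffix = m.groups()
              return {
                  "series": int(series) * 1000,          # thousands digit: 5000 newer than 4000
                  "tier": int(tier) * 100,               # hundreds digit: x700 sits above x600
                  "integrated_graphics": "G" in suffix.upper(),  # trailing G = integrated GPU
              }

          print(decode_ryzen("Ryzen 7 5700G"))
          # {'series': 5000, 'tier': 700, 'integrated_graphics': True}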

      Nvidia are one of the worst when it comes to naming products.

      • by thegarbz ( 1787294 ) on Wednesday August 04, 2021 @07:30AM (#61654715)

        The generation is denoted by the thousands

        Actually that's where you're wrong and that's kinda the point.

        Here's a quick test. Which generation are the following:
        Q: Ryzen 5 3600, vs Ryzen 5 4600GE
        A: The same generation.

        Q: Ryzen 3 2300X, vs Ryzen 5 2400G
        A: Different. The 2300X is a Zen+ chip whereas the 2400G is the previous generation architecture.

        Q: Ryzen 5 5600X, vs Ryzen 5 5600G
        A: The same generation.

        You want to know the first thing I did when I read TFS? I googled to see which generation architecture is under these APUs, because AMD fucked up this simple-to-understand numbering scheme. Though I have high hopes that this is finally an indication that they've left their stupidities behind them.

        Nvidia are one of the worst when it comes to naming products.

        Hardly. Intel is one of the worst. With NVIDIA you get exactly what you just described for AMD, including the brain-dead practice of mixing architectures across product numbers (the NVIDIA GTX 1660 and RTX 2060 are both Turing, just like AMD's 2300X and 3400G are both Zen+).

        • by AmiMoJo ( 196126 )

          Huh, thanks, I was not aware of that. That's some BS... But now you mention it, I do recall Gamers Nexus saying this a while back in a video.

    • At least the AMD numbers now use the same thousands number for the same generation between mobile and desktop. Having 3xxx for desktop and 4xxx for mobile was wonderful for clarity.

  • The first 1,000,000,000 Hz CPU...

    March 6, 2000 - (ZDNET) AMD just broke an important processor clock speed barrier.

    Advanced Micro Devices Inc. announced Monday that it is shipping a 1GHz, or 1,000MHz, Athlon processor.

    Time flies....

    • Yeah, x86 CPU speed went from 33 MHz (Intel) in 1990 to 1000 MHz (AMD) in 2000.

      A 30x speed increase. I guess the 90s kicked the most ass for CPUs.

      • I never experienced a bigger jump in CPU power/speed than when I upgraded from an 8MHz 8088 to a 12MHz 80286. You'd think it would only be roughly 50% faster, but it felt more like 500%.

        • The closest related experience recently has been going from an HDD to an SSD... that was an impressive jump.

    • by sconeu ( 64226 )

      I remember. I bought a 1.1GHz Athlon system in early 2001.

    • And last week I was programming a 1MHz PDP-8 with 4K words of memory.
    • I built a SLOT-A Athlon system in 2000 just to be part of the 1GHz club: https://www.cpu-world.com/CPUs/K7/AMD-Athlon%201000%20-%20AMD-K7100MNR53B%20A.html

      I still remember doing kernel compiles on it in Slackware and just being shocked at how much faster it was than my Pentium II 233MHz system.

      I got that same feeling last year going to Threadripper, 20 years later. AMD might not always keep the performance lead, but when they take it, they take it with style.

  • I remember in the 80s and 90s, not just transistor counts but actual CPU speed used to double every two years. Now we are at a 20% bump every 18 months. I sure hope that in the 2030s we won't be at a brick wall. The way things are going, I don't know. Hopefully it will be enough for 12K per-eye VR by then, since we are about 2 to 4 doublings from that.
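
    As a back-of-envelope check on that doubling count (illustrative assumptions: today's headsets render roughly 2K-4K per eye, "12K" means a nominal 11520x6480 panel, and required GPU throughput scales with pixel count):

        import math

        def pixels(w: int, h: int) -> int:
            return w * h

        target = pixels(11520, 6480)                       # a nominal "12K" per-eye panel (assumed)
        from_4k = math.log2(target / pixels(3840, 3840))   # high-end today (assumed)
        from_2k = math.log2(target / pixels(2160, 2160))   # mid-range today (assumed)

        print(f"~{from_4k:.1f} doublings from 4K per eye")  # ~2.3
        print(f"~{from_2k:.1f} doublings from 2K per eye")  # ~4.0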

    • CPU clock speeds have been roughly stagnant for the past decade. That's because silicon doesn't reliably scale past 5 GHz.

      Back when we had CPUs at 4 MHz it was relatively "easy" to run them at higher clock speeds -- shrink the die and manage heat, to greatly simplify.

      • Re:20% improvement (Score:5, Informative)

        by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Wednesday August 04, 2021 @12:40AM (#61654011) Homepage

        The speed of electricity in silicon is a tad more than 2/3 c (2/3 the speed of light in a vacuum). That is about 20 cm in 1 ns (nanosecond). So the max distance between parts of a 1 GHz CPU is 20 cm, and for a 5 GHz CPU it is 4 cm. The tracks in a CPU are not straight lines. Add to this gate delay (every time the signal goes through AND or OR gates, etc.); this is a few picoseconds per gate, and there can be a few dozen gates in a critical path. Add in capacitive delay... you get the idea.

        So one way to up the GHz rating is to make CPUs smaller - but that makes it harder to dissipate the heat generated.
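
        Those figures are easy to reproduce. A small Python sketch of the same back-of-envelope math (using the ~2/3 c signal speed from the post; real chips lose further ground to gate and capacitive delays):

            C = 299_792_458               # speed of light in a vacuum, m/s
            signal_speed = (2 / 3) * C    # ~2e8 m/s in silicon interconnect

            for ghz in (1, 5):
                cycle_time = 1 / (ghz * 1e9)                # seconds per clock cycle
                reach_cm = signal_speed * cycle_time * 100  # distance covered in one cycle
                print(f"{ghz} GHz: signal travels ~{reach_cm:.0f} cm per cycle")

            # 1 GHz: signal travels ~20 cm per cycle
            # 5 GHz: signal travels ~4 cm per cycle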

        • Thanks for the information!

          Back in 2007 IBM managed to get a silicon CPU, normally running at 350 GHz at room temperature, running at 500 GHz with liquid cooling.

          What is the reason for GaAs / etc. being used in 500+ GHz chips -- better thermals than silicon? Speed of electricity? A combination?

  • Are these actually going to be stocked at Amazon on Aug 5th, or is this a hypothetical release of processors some time after OEMs stop buying the parts and they have some excess capacity on the production lines?
    • It's been available to OEMs for the past 6 months. That being said, it will probably be popular enough to sell out pretty quickly because the graphics card market is still inflated by 200% or so.

  • Why don't they release good APUs? They're using their older GPU architecture and keeping the chip to a TDP that's very low considering this is both a CPU and a GPU.

    Consider: if the 65W TDP were raised to 125W, then surely they could triple or quadruple the number of GPU cores and have a killer APU.

    • How do you consider it rubbish? They work fine for their intended purpose, which is general productivity and casual gaming. I played Doom Eternal just fine on my 3200G.

      If you are looking for 1337 powa, then that is not what they are designed for.

      • by MrL0G1C ( 867445 )

        I know the actual answer to my question, and it's that they don't want to cut into their video card market. It's a shame; they could make far better APUs, but they won't.

        And Doom Eternal isn't demanding; a 3200G can't cope with most modern AAA games unless you lower the settings and the resolution a lot.

          I know the actual answer to my question, and it's that they don't want to cut into their video card market. It's a shame; they could make far better APUs, but they won't.

          And Doom Eternal isn't demanding; a 3200G can't cope with most modern AAA games unless you lower the settings and the resolution a lot.

          People who care about gaming are going to buy a GPU. It would be a very small market segment that would want a gaming APU.
          Personally, I'd rather the GPU were just enough for a functioning desktop and the CPU had as many cores as possible.

          • by MrL0G1C ( 867445 )

            I don't agree that everyone wants or can afford a dedicated GPU; they're expensive. Not all of the market is hardcore FPS gamers; there is a middle ground that would likely be happy with an APU that is better than the current 5700G and has 6 cores, since there are rapidly diminishing returns in current games with more than 6 cores.

            Also, AMD can cater to more than one market; it doesn't need to make all APUs the same, which is the implication of your "GPU just enough for a functioning desktop" statement.

            "as many core

        • Yes... it is for CASUAL gaming, by design. What you want is NOT casual gaming, so you have to pay for a GPU.

          The integrated graphics work exactly as expected; what you are expecting is like expecting a chain takeaway store to serve restaurant-quality food. Everyone knows what they are getting with an APU: casual gaming, not "triple A games with 4K resolution at 60+ fps" (or whatever pro gamers like to have).

          You can use an APU to play most games at 1080p with mid-range settings.

          • by MrL0G1C ( 867445 )

            You say this like there can be no middle ground and no diversity in products, therein lies the flaw in your argument.

            If AMD made a better APU, people would find out about it, budget gamers would very likely buy it in droves, and system builders would love it; Dell would have an orgasm.

            Not all casual gamers want crap framerates, and not all serious gamers can afford an RTX 30-series.

            • TL;DR: If AMD has not done it yet, there is a good reason. Find that reason and you have your answer.

              Not at all. The middle ground is what currently exists; what you are asking for is not realistic (or more correctly, would require significant motherboard power and thermal management changes).

              Your argument is that casual gamers NEED more than what the current APUs provide, which is incorrect. If you "must have" decent frame rates and high settings on AAA games, then by definition you are not a "casual" gamer.

    • It has to fit in the same socket as their non-APU chips and gets the same memory bandwidth. Throwing more and more graphics cores at the same memory bandwidth yields diminishing returns.

      • by MrL0G1C ( 867445 )

        That's a good point, but where is that point of diminishing returns? I have no idea, but I still think the APUs are nowhere near it.

      • It has to fit in the same socket as their non-APU chips and gets the same memory bandwidth. Throwing more and more graphics cores at the same memory bandwidth yields diminishing returns.

        Your thesis requires that the non-APU chips are using all the pins, something that is not accurate.

        Hint: They employ a few hardware engineers. Who understand the sockets.

    • Comment removed based on user account deletion
      • by MrL0G1C ( 867445 )

        IDK about that, but they are competing by selling video cards and CPUs, so I doubt that's true.

        Also the Steam Deck will have RDNA cores and Valve claims it will be a very capable device.

        I think the real reason is AMD are being overly cautious and not testing the waters to see if there is a market for a more powerful APU. Maybe if the Steam Deck sells like hot cakes they will reconsider. What I think AMD hasn't adequately taken into account is that they can take some of Nvidia's lower range market with a good

      • With as many console APUs as they have been making, they need a market for the binned parts, which by necessity will have fewer compute units, because they didn't all pass quality control.

    • Consider: if the 65W TDP were raised to 125W, then surely they could triple or quadruple the number of GPU cores and have a killer APU.

      Would they though? Genuine question, but aren't they going to run into serious diminishing returns due to running out of bandwidth, because they're using system memory rather than dedicated GPU memory (which tends to trade higher latency for much higher throughput)?
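
      The numbers make that bandwidth ceiling concrete. A rough Python sketch (illustrative figures: dual-channel DDR4-3200 feeding the APU, versus a typical 256-bit GDDR6 card at 14 Gbps):

          def ddr_bandwidth_gbs(mt_per_s: int, channels: int, bus_bits: int = 64) -> float:
              # transfers/s * bytes per transfer per channel * channels
              return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

          apu_ram = ddr_bandwidth_gbs(3200, channels=2)  # ~51 GB/s, shared with the CPU
          gddr6 = 14e9 * (256 / 8) / 1e9                 # ~448 GB/s on a discrete card

          print(f"APU system RAM: ~{apu_ram:.0f} GB/s")
          print(f"discrete GDDR6: ~{gddr6:.0f} GB/s")

      However many extra CUs you add, they can't be fed past what ~50 GB/s of shared DRAM can supply, which is where the diminishing returns come from.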

    • Consider: if the 65W TDP were raised to 125W

      125W TDP at the node sizes they are using pretty much necessitates the use of advanced cooling due to the heat source being so concentrated on the die. It would be a killer APU for sure, but the Venn diagram of people who use advanced cooling solutions like AIOs and people who buy APUs doesn't really overlap very much.

      • by MrL0G1C ( 867445 )

        Here is why AMD should do this:
        https://store.steampowered.com... [steampowered.com]

        Nvidia are completely dominating at all levels; putting together a good APU would generate a lot of buzz, and it would be a chip that system integrators and budget builders would love. Paired with DDR5, AMD could create a chip that would take the market by storm.

        APUs right now are rather dull. I'd love to see that change; I'd like to see AMD make one with RDNA2 (or beyond), more compute units, and a stipulation to use reasonably fast memory.

        Valve

        • I'm not sure what you think you're promoting there. Anyone on that list would be better served if AMD produced and promoted its existing GPUs more. That list isn't some market base of people who aren't interested in GPUs, and again, it doesn't resolve the issues of putting 125W out of a 7nm node.

          The Valve Steam Deck will be interesting to see; unlike the APUs AMD normally sells, the Steam Deck has RDNA2 instead of Vega.

          And it also has an incredibly low resolution display to drive. Don't hold your breath that this is some amazing feat of a chip. There are some laws of physics you're ignoring if you think that a bit of an extra power b

  • AMD has yet to announce 35W GE-series availability to the public.

    I have been waiting over a year, since before the 4000 series, to upgrade my home theatre box with a low-power AMD CPU with integrated graphics, to replace my very old Sandy Bridge setup. My 4K OLED TV doesn't look very good with the screwy video out from that ancient Intel box.

    I'm still waiting, and I'm getting pissed.

  • Twenty years ago the sound card market was decimated by "good enough" integrated audio chipsets. Sure, the Sound Blaster cards of the time performed better on every level, but no one wanted to drop $100 on a sound card when the integrated sound was good enough for 90% of the people buying computers.

    This is a perfect time for Intel to up their integrated graphics game. Intel does not have to design a graphics chipset that will compete with Nvidia or AMD's top-of-the-line video cards. They do not need to
    • Intel is going to start selling dedicated GPUs, but I'm not sure that's what you are talking about.
    • Intel does not have to design a graphics chipset that will compete with Nvidia or AMD's top-of-the-line video cards.

      Isn't that exactly what they have done with their current integrated GPUs? Anyone picking a GPU probably wouldn't be able to do what they want with integrated graphics anyway; they're separate markets.

  • AMD is resting on their laurels, and withholding Navi iGPUs because there is no competition in the space.

    How long before Intel catches up with them in the iGPU space?

    Stay tuned.

  • Over a break at the end of last year, I built myself a new home PC, using an ASRock Z390m-ITX/AC [asrock.com] motherboard, an Intel Core i9-9900 [intel.com] CPU, 32GB RAM and a 2TB Samsung M.2 SSD... all housed in a fanless Streacom FC8 [streacom.com] case. The 3 display ports on the motherboard are perfectly capable of giving me 60Hz refresh on a triple set of Gigabyte Aorus FI27Q-X [gigabyte.com] 27" monitors.

    That's a desktop resolution of 7680x1440, with stunning picture quality. Running Prime95, I saw an initial turbo-boost clock speed of 5.00GHz across
  • Up until now, AMD's APUs have inexplicably been numbered one generation higher than its CPUs.

    E.g. Zen+ CPUs: 2000 series. APUs: 3000 series.
    Zen2 CPUs: 3000 series. APUs: 4000 series.

    It's good to see Zen 3 CPUs and APUs both sharing the 5000 series number. Honestly, what they were doing up to this point was just dishonest from a marketing perspective.

  • I'm not sure why AMD is still pushing the stillborn Vega architecture on their modern chips when RDNA2 is so much better an alternative, especially when Intel Xe is blowing the pants off Vega in graphics benchmarks.

    Is it because RDNA2 requires DDR5 at minimum?

  • The heading mentions value price points, but the summary does not...

    -'journalism'
  • Can you people go back to making the best CPUs and stop trying to integrate everything including the kitchen sink with my CPU, please? If I want a graphics card I will buy the one I want, not use some half-baked rubbish that's integrated and paid extra for. I don't want extra cost or extra transistors on my CPU that are a complete waste of time simply because some stupid MBA wants to increase the profit off his product with something no one will use. Otherwise you can go die like Intel is now.
    • by malkavian ( 9512 )

      Very glad you don't run the company.
      The bulk of the market these days doesn't much care about discrete graphics cards anymore. General office machines just don't need that kind of power, especially when the entire use case of the office boxes is spreadsheets, word processing and general applications.
      Building a cut-down GPU that can handle basic (for a modern system) acceleration of all the requirements into the base CPU, and having the cost of that be fractionally higher, cuts down on the overall cost of the

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein
