AMD Hardware Technology

AMD's and Nvidia's Latest Sub-$400 GPUs Fail To Push the Bar on 1440p Gaming (theverge.com) 96

An anonymous reader shares a report: I'm disappointed. I've been waiting for AMD and Nvidia to offer up more affordable options for this generation of GPUs that could really push 1440p into the mainstream, but what I've been reviewing over the past week hasn't lived up to my expectations. Nvidia and AMD are both releasing new GPUs this week that are aimed at the budget PC gaming market. After seven years of 1080p dominating the mainstream, I was hopeful this generation would deliver 1440p value cards. Instead, Nvidia has started shipping a $399 RTX 4060 Ti today that the company is positioning as a 1080p card and not the 1440p sweet spot it really should be at this price point.

AMD is aggressively pricing its new Radeon RX 7600 at just $269, and at that price and performance level it's definitely more suited to 1080p. I just wish there were an option between the $300 and $400 marks that offered enough performance to push us firmly into the 1440p era. More than 60 percent of PC gamers are playing at 1080p, according to Valve's latest Steam data. That means GPU makers like AMD and Nvidia don't have to target 1440p with cards that sell in high volume, because demand seems to be low. Part of that low demand could be because a monitor upgrade isn't a common purchase for PC gamers, or because they'd have to pay more for a graphics card to even support 1440p. That's probably why both of these cards also still ship with just 8GB of VRAM: why ship more if you're only targeting 1080p? A lower resolution doesn't need as much VRAM for texture quality. I've been testing both cards at 1080p and 1440p to get a good idea of where they sit in the GPU market right now. It's fair to say that the RTX 4060 Ti essentially offers the same 1440p performance as an RTX 3070 for $399. That's $100 less than the RTX 3070's $499 price point, which, back in October 2020, I said hit the 1440p sweet spot for games of that era. It's now nearly three years on, and I'd certainly expect more performance here at 1440p. Why is yesterday's 1440p card suddenly a 1080p one for Nvidia?
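For a sense of scale on the 1080p-versus-1440p question the summary raises, here's a quick back-of-the-envelope sketch (plain resolution arithmetic, not benchmark data from the article): 1440p pushes roughly 78 percent more pixels per frame than 1080p, and 4K roughly four times as many.

```python
# Rough pixel-count comparison between common gaming resolutions.
# Standard resolution math only; no benchmark data involved.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base_w, base_h = resolutions["1080p"]
base_pixels = base_w * base_h

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base_pixels:.2f}x 1080p)")
# 1080p: 2,073,600 (1.00x), 1440p: 3,686,400 (1.78x), 4K: 8,294,400 (4.00x)
```

That extra per-frame pixel load is the gap these sub-$400 cards are being asked to cover.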



Comments Filter:
  • by HBI ( 10338492 ) on Wednesday May 24, 2023 @09:45AM (#63547915)
    I know that my tolerance for spending a lot for a monitor is low. Even the high resolution 4k displays on current laptops are hard to read when you get older. You can muck with scaling and such to make them usable, but I suspect that the 1080p monitor market will remain healthy for quite a while yet. There's probably a sweet spot here, beyond which monitor resolution doesn't seem that relevant to most.
    • Chicken and egg, though. Most of the displays are 1080 or 4k because TVs, but if GPUs had better 1440p support then you'd probably see more people buy those monitors and drive the price down. Since high DPI monitors are STILL problematic today, there's a good argument to be made for buying different resolutions for differing size ranges. Most displays are assumed to be somewhere between 72 and 96 PPI depending on the era you're coming from (72 being the standard for 68k Macintoshes) so it's best to aim for

      • by omnichad ( 1198475 ) on Wednesday May 24, 2023 @10:16AM (#63547997) Homepage

        That egg is laid by a golden goose. The reason 1440p is a metric is that someone wants to drive sales of QHD monitors and displace existing 1080p screens. If the picture can still improve without the higher resolution, why bother? And that's the case - these new GPUs can't hit high frame rates at high resolutions specifically because the new features improve the picture quality and nobody wants to turn them off. With those turned off, 1440p would work great.

      • Comment removed (Score:4, Insightful)

        by account_deleted ( 4530225 ) on Wednesday May 24, 2023 @12:53PM (#63548387)
        Comment removed based on user account deletion
        • I hate to agree with HBI but I think he made the right point: to most people, the difference between 4K, 1440P, and 1080P, is not sufficient that it's worth the extra expense in resources getting the higher resolution monitors.

          I'm using a 40" 1080P TV (Sony Bravia) that I got for $50 as a monitor. This is 55 PPI. Do you have any idea how terrible text looks at 55 PPI? Well I'll tell you, it looks like goddamned garbage. But a 40" 4k would be 110 dpi, which is a bit excessive. Admittedly, a 40" is slightly excessive itself, I actually have to turn my head to focus on some parts of it. But it is pretty cool for games, filling more of my vision results in a more immersive experience for sure, but subpixel AA is super noticeable.

          Obv
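          The PPI figures in the comment above check out; here's a quick sketch of the geometry (assuming a 16:9 panel and the usual convention of measuring the diagonal in inches):

```python
import math

# Pixels per inch for a panel, given its resolution and diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(f'40" 1080p: {ppi(1920, 1080, 40):.0f} PPI')   # ~55 PPI
print(f'40" 4K:    {ppi(3840, 2160, 40):.0f} PPI')   # ~110 PPI
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109 PPI
```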

        • This is why scaling was invented. Also a DPI issue since resolution is tied to screen size and the distance you view it from.

          My eyes aren't great either after nearly half a century, but 27" 1440p at arms length is right about where my limit is without turning up scaling.

          The thing about scaling is that it really only matters for reading text. For everything else more pixels is more better.

          The question I find myself asking,...myself, is do I even need 4K to 'justify' the 4090 I'm running, because at 1440p I have

        • I hate to agree with HBI but I think he made the right point: to most people, the difference between 4K, 1440P, and 1080P, is not sufficient that it's worth the extra expense in resources getting the higher resolution monitors.

          I challenge this. For many people they very much do want higher resolution monitors and the difference is there since they would make a corresponding jump in size too, but in many cases they are priced out of reach for the quality they expect. Sure you can get some cheap 4K monitors, but they are trash. As soon as you go to something with acceptable video quality they start getting very expensive. I looked at replacing my 10 year old 1000EUR wide gamut monitor, and the 1440p equivalent monitor a single inch

    • by mobby_6kl ( 668092 ) on Wednesday May 24, 2023 @10:49AM (#63548079)

      1440p on the desktop isn't a very high resolution or expensive to buy; a good IPS display is like $250, and you can probably find some cheaper TN garbage.

      I'm sure 1080p will stay here as a budget option for a while but it seems to be finally going the way of the dodo. There's really not much reason for it to exist.

      • by eth1 ( 94901 )

        Came here to say something similar: IMO, 1440p *is* the sweet spot right now. You get some extra pixels for desktop app real estate, monitors aren't crazy expensive, you can actually go up to 10ft on displayport cables at that resolution without going out of spec, and mid-range graphics cards have no trouble with it. (In fact, my old GTX 970 was fine for years with 1440p).

        Going up to 4k locks you into expensive graphics cards for not too much benefit, and 1080p has always felt cramped, especially in regards

      • a good IPS display is like $250

        No, a crap IPS display is like $250. A good 1080p IPS display is like $250, same price as it was close to a decade ago.

    • by tlhIngan ( 30335 ) <slashdot&worf,net> on Wednesday May 24, 2023 @10:52AM (#63548091)

      1440p is also known as QHD, or "Quad High Definition". It refers to 4x the 720p resolution (720p is 1280x720), so 4 times the pixel count is 2560x1440.

      It's why "4K" is called "Ultra HD" - they couldn't use "Quad HD" because it was stolen by the 1440p crowd, even though it is 4x 1080p (1080p is 1920x1080, UHD is 3840x2160).

      It matters because lots of people are stuck on the belief that 720p isn't "HD". Especially on TV - 720p is used for sports because 720p60 takes the same video bandwidth as 1080i60. So you would use 1080i for stuff needing higher resolution but a lower frame rate, while 720p is used for things that need a high frame rate but not so much resolution, e.g. sports. Any sports game needs 60 progressive frames because of its fast action, so you compromise on resolution (you don't need that many pixels to recognize a jersey or a ball) but get it back in frame rate.

      1440p is a troublesome resolution because it's not a supported resolution of CEA-861, which specifies the supported resolutions of consumer electronics and their timings.
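      The "quad" naming and the 720p60/1080i60 bandwidth parity mentioned above are easy to sanity-check with raw pixel rates. A rough sketch (uncompressed pixel counts only; real broadcast bandwidth depends on the codec, which isn't modeled here):

```python
# Raw pixel throughput, ignoring compression, chroma subsampling and blanking.
def pixels_per_second(width: int, height: int, rate: int, interlaced: bool = False) -> int:
    # For interlaced video, each of the `rate` fields carries half the lines.
    fields = 2 if interlaced else 1
    return width * height * rate // fields

print(f"720p60 : {pixels_per_second(1280, 720, 60):,} px/s")                    # 55,296,000
print(f"1080i60: {pixels_per_second(1920, 1080, 60, interlaced=True):,} px/s")  # 62,208,000

# The "quad" naming: QHD has 4x the pixels of 720p, UHD has 4x the pixels of 1080p.
print((2560 * 1440) // (1280 * 720))     # 4
print((3840 * 2160) // (1920 * 1080))    # 4
```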

      • by ljw1004 ( 764174 )

        So you would use 1080i for stuff needing higher resolutions but lower framerate, while 720p is used for things that need high framerate but not so much resolution. E.g, sports. Any sports game needs 60 progressive frames because of its fast action so you compromise on resolution (you don't need that many pixels to recognize a jersey or a ball) but get it back in frame rate.

        I think you actually do need more pixels to recognize a jersey. I watch soccer from time to time. Here's a typical camera view: https://youtu.be/3gm_Uyhoj9s?t... [youtu.be]
        * At 720p, player is 48 pixels high, number on jersey is 9 pixels high,
        * At 1080p, player is 73 pixels high, number on jersey is 14 pixels high
        * At 1440p, player is 98 pixels high, number on jersey is 19 pixels high
        * At 4k(2160p), player is 146 pixels high, number on jersey is 28 pixels high

        We usually get oblique angles on the numbers on the jersey.
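        Those pixel heights scale essentially linearly with vertical resolution, which is the point in numbers. A small sketch (assuming the camera framing is fixed, so on-screen size is simply proportional to the number of scan lines; the 720p figures are taken from the comment above):

```python
# On-screen size scales linearly with vertical resolution when framing is fixed.
# Reference figures from the parent comment: at 720p the player is ~48 px tall
# and the jersey number ~9 px tall.
player_720, number_720 = 48, 9

for lines in (720, 1080, 1440, 2160):
    scale = lines / 720
    print(f"{lines}p: player ~{player_720 * scale:.0f} px, "
          f"number ~{number_720 * scale:.0f} px")
# Gives 48/9, 72/14, 96/18, 144/27 -- close to the measured 73/14, 98/19 and 146/28 above.
```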

        • Soccer is a special case though, because the pitch is so god damn big. I remember in the pre-HD days it was like watching ants kicking an aspirin around the field because of how far away they have to put the cameras in order to be able to cover the game. At least now you can make out a number if you squint at it just right due to much higher resolution and far better optics on the cameras.

          HDTV made watching hockey on TV a plausible thing to do. In shitty old NTSC you were never going to actually see the

      • by edwdig ( 47888 )

        Any sports game needs 60 progressive frames because of its fast action so you compromise on resolution (you don't need that many pixels to recognize a jersey or a ball) but get it back in frame rate.

        I'm used to watching Yankees games in 1080. The extra resolution is great. I remember in SD any time there was a close play, you'd watch the replay and it would all be a blur. Now with the high resolution you can clearly see exactly when the ball hits the glove and when the runner hits the base. Having the high resolution is great.

      • Especially on TV - 720p is used for sports because 720p60 takes the same video bandwidth as 1080i60.

        Um... My Cable TV provider has a "4K" option, which, for a very small fee (about a dollar, if I remember correctly), offers you a couple 4K channels. I didn't take it because one was something I didn't care for, and the other was a... sports channel. My previous cable TV provider also had a 4K option: one TV channel... it was a sports channel.
        And I live in a 3rd world country.

      • 1440p is a troublesome resolution because it's not a supported resolution of CEA-861

        This sounds like the kind of thing that isn't relevant to anyone seeing how there are many 1440p displays on the market and no reports of any widespread issues using them.

        Sometimes you don't need a standard to bless your electronics to life.

    • PS5 & XBone2 have 16gb of ram in one solid block, and ports aren't getting optimized for 8gb, so some recent ports (the new Jedi game & The Last of Us notably) gobble vram on PC and don't do a good job streaming data in/out of vram resulting in stuttering even if FPS is still high.

      There might be fixes coming for both games (The Last of Us has been out a while so I'm skeptical on that one) but it's going to be more and more of an issue even at 1080p, which is just nuts.
      • by fazig ( 2909523 )
        You're joining in on spreading the FUD.

        The console can only address about 12GB of the RAM for the GPU. There are issues with bad implementations in those games, yes.

        One of the main issues is that players keep insisting on using texture sizes that were meant for 2160p resolutions on 1440p or even 1080p resolutions. Going from 2k to 4k textures increases the memory requirement up to a factor of 4, while a lot of people couldn't even tell the difference at smaller resolutions unless they get really close to
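        The "up to a factor of 4" claim is just square-texture arithmetic: doubling the edge length quadruples the pixel count, and with it the memory needed for a given format. A rough sketch (assuming uncompressed RGBA8 and a full mip chain adding about a third; real games use block-compressed formats, so absolute sizes are smaller, but the 4x ratio holds):

```python
# Approximate VRAM footprint of a square texture, RGBA8 (4 bytes per pixel),
# with a full mip chain (~1/3 extra). Real games use block compression, so
# actual sizes are smaller, but the 4x ratio between 2K and 4K textures holds.
def texture_bytes(edge: int, bytes_per_pixel: int = 4, mips: bool = True) -> float:
    base = edge * edge * bytes_per_pixel
    return base * 4 / 3 if mips else base

for edge in (2048, 4096):
    mib = texture_bytes(edge) / (1024 ** 2)
    print(f"{edge}x{edge}: ~{mib:.0f} MiB")
# 2048x2048: ~21 MiB, 4096x4096: ~85 MiB -- a 4x jump per texture.
```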
        • getting spanked by the 3060 12 gig in those benchmarks I mentioned. I think it's pretty obvious why.

          Consoles open up more ram to developers over time. It's an old trick to make newer games look better over time. Sony on the PS3 just came right out and said they do it. It's a way to artificially prolong console life cycles w/o requiring large increases in skill or budgets.

          It doesn't matter if 8gb is enough if developers code to 12gb. Hell, NFS Heat ran like crap on my i5 7600. Upgrade to a Ryzen 560
          • by fazig ( 2909523 )
            I just watched the video and can't see your claims confirmed.
            For The Last of Us Part 1 at 1920x1080 Ultra the RTX 4060 Ti gets 72 average and 63 low, while the RTX 3060 gets 56 average and 51 low. In the 2560x1440 Ultra the 4060 Ti shows 50 average and 45 low while the 3060 shows 38 average and 34 low.
            And I don't see the "new Jedi game" in there. Thus I'm not even sure what you are talking about here as your own data can't back up your claims.
            Source: https://www.youtube.com/watch?... [youtube.com]

            I've seen other rev
          • by fazig ( 2909523 )
            I see, it's about the RX 7600.
            Sorry for conflating that.

            Though there are other issues when comparing NVIDIA to AMD cards in this context. For some mysterious reasons that I can't explain, AMD cards seem to have a larger memory overhead. I've seen a German media outlet cover it https://www.youtube.com/watch?... [youtube.com] but they offer no good explanation. Here and there it also pops up in the data of other reviewers when they show side by side comparisons of games with the same graphics settings where you can see
        • Yeah, if people were open to a calm discussion it would be easy to provide enough examples to get the point across. But there are a few prominent games that are held up as proof of why cards have to have at least 12 GB of video ram in order to be ready to play AAA titles and bask in their glory.

          I currently have a RTX 4080, and my last card was a 10 GB RTX 3080. Using Afterburner hardware monitor I would from time to time monitor how much video ram the game I was playing had filled up. Most of the time my ca

          • by fazig ( 2909523 )
            Yes, it's often also simply the available processing power that keeps weaker graphics cards down in modern games.

            I used to have a GTX 1070 as well, which then didn't have enough VRAM to do proper *texture painting. Until the beginning of this year I had a Quadro P6000 (24GB GDDR5X) that I bought used for a cheap price (was most likely stolen). There I had the VRAM, but in terms of processing power for video games, that card punched in the weight class of a GTX 1080 (non-Ti). And that is something that
    • by Shaitan ( 22585 )

      There is very little reason to upgrade to the latest gen to drive a flat display unless your card is multiple generations old. The reason to upgrade is for VR, where even a top end 4090 is sometimes struggling.

      A flat gamer will generally outperform a VR gamer because all flat games are essentially testing the ability to use a completely unrelated interface to chain hotkeys and flick your wrist in response to a couple pixels changing; there is no authenticity to the experience. It's like playing minecraft with c

      • by HBI ( 10338492 )
        I suppose if you are interested in VR this makes sense. Personally, I'm not. I'll look forward to you alpha testers figuring out how to make it really compelling.
      • by fazig ( 2909523 )
        That really depends on the game design whether you "flick your wrist in response to a couple of pixels changing" or not.
        Most people who enjoy stuff like flight sims, controlled by joysticks, pedals, perhaps some head and/or eye tracking as well, can probably tell you that this is an oversimplification.

        I get your point that mouse inputs often betray whatever physical simulation a lot of games are trying to do. The mouse movement is often interpreted as some kind of zero order input where the mo
        • by Shaitan ( 22585 )

          "Most people who enjoy stuff like flight sims, being controlled by joysticks, pedals, perhaps some head and or eye tracking as well, can probably tell you that this is an oversimplification."

          More like that is a niche segment. Most people are playing various sorts of shooters and mmo's and those fall solidly within my criticism. When it comes to flight sims and the like, they are still better experienced in VR [ideally with a motion rig] but obviously the control criticism isn't as applicable. There are also

          • by fazig ( 2909523 )
            What you mean is games that are designed around mouse, touch, and similar input schemes.
            Most of such schemes rely on a graphical user interface to convey information. In those situations the user interface has traditionally been kept simple because of technical limitations in the past. GUI is fairly performant compared to other methods and is less reliant on high quality peripherals like high resolution displays or quality input devices. This has been going on for decades and is now being partially ke
            • by Shaitan ( 22585 )

              "But what do I waste my time here on. Given the tone of voice that you use, you're not interested in discussing the finer points anyway. What you seem to want is a Star Trek Holodeck, where you have some kind of super fast 3D printer that can make objects (like food) and otherwise use force fields to give holographic projections the feeling of mass. That would certainly be cool, but for all we know for now, not possible with our technology."

              Yes that would be very cool. But reading the rest of your comments

            • by Shaitan ( 22585 )

              Let me simplify this even further. You don't need to have a holodeck for things which are taking place on an immersive three dimensional stage to be better experienced with an immersive three dimensional viewing experience and true to life physics and interactions. Controls are a nice bonus and in some specialized applications like a flight sim more or less essential, but having a 3D immersive view of what in modern games is a rendering of a 3D immersive space is better, even with the same controls. The mor

              • by fazig ( 2909523 )
                Except all the people that experience motion sickness with VR headsets when the input lag gets too large and the vestibular system in their inner ear sends information to their brain that conflicts with what they see with their eyes on the display.
                There are still some hurdles to overcome before it can have widespread application, which is why we won't have an "all else equal" situation any time soon. Optimization for keeping input latencies low enough is extremely important here, whi
                • by Shaitan ( 22585 )

                  "Except all the people that experience motion sickness with VR headsets when the input lags get too large and the information their vestibular system in their inner ear sends information to their brain that's conflicting with what they see with their eyes on the display."

                  This isn't really an issue on current generation hardware and for those who do experience it, it generally stops happening soon enough. Our brains figure out what is going on within a few sessions or a couple hours of play and this more or

                  • by fazig ( 2909523 )
                    Flat menus can be fine even in a VR environment IF the menus are handled like flat menus in reality.
                    In reality we have such flat menus on our phones for example where they are confined to the gadget. And that's the pivotal point there, the menu does have a representation in our environment and doesn't float in a disembodied fashion before our eyes regardless of where we look.

                    If I had to speculate about the flat "meta" menus (disembodied and only existent for the user), I'd think that it's a combination of lazi
      • by noodler ( 724788 )

        Fucking hell, VR has its fist deep up your arse.
        Why would I want to sit with a multi-pound helmet on my head for hours on end, waving my hands in the air, waiting to become sea sick?
        I mean, it can be fun, but VR becomes fucking uncomfortable pretty quickly. And most homes don't have the space for room scale.
        Never mind the lack of high quality games on VR. The best VR games are many years old by now, some almost a decade.
        For now, VR has hit a wall and isn't going anywhere.

        • by Shaitan ( 22585 )

          "Why would i want to sit with a multi-pound helmet on my head for hours on end, waving my hands in the air, waiting to become sea sick?"

          That isn't an accurate depiction of VR. Some people experience something akin to sea-sickness early on, but it generally passes quickly. That is why the nausea related settings are generally labeled as experience levels.

          As for the rest of what you describe it sounds like maybe you tried a bare bones unbalanced headset. Meta was able to crank

          • by noodler ( 724788 )

            Some people temporarily experience something akin to sea-sickness as a temporary thing early on but it generally passes quickly.

            I think you're underestimating the number of people who have this problem. Consider that VR is tiny compared to the general population so chances are the people that are still using it are the small group that doesn't experience these problems.

            As for the rest of what you describe it sounds like maybe you tried a bare bones unbalanced headset.

            How does 'a bare bones headset' explain why there are mostly rudimentary games on VR?
            And do you consider a Rift CV1 an unbalanced bare bones system? To my knowledge both quests are heavier than the CV1.

            and to add active cooling.

            Fucking hell, you need to add active cooling to make it work OK?

            • by Shaitan ( 22585 )

              "I think you're underestimating the number of people who have this problem."

              I don't. Quite a few people have this problem. But they don't continue to have this problem going forward. I know and have introduced quite a few people to VR, dozens, and of those only one still had an issue after three sessions.

              "Like i said, most people don't have this much room available for a few hours of gaming."

              I suppose it depends on how you define "most people." Assuming we are excluding people who can't afford to game in the

              • by noodler ( 724788 )

                The actual issue is the psychological barrier erected by naysayers such as yourself using the kernel of these minor inconveniences.

                I know what i experience and i'm pretty sure you're not paranormal and can't look in peoples heads so you can shove your gaslighting up your fat arse.

                You're blaming the users and not the product. The market has spoken and people are just not as interested in VR as a small minority would like to see.

                VR/AR technology isn't going anywhere.

                Oh, pray tell, what marvel of VR software, real system seller quality, was released in the past year? Past two years? Past 3? For fucks sake, Alyx was over 3 years ago.
                Also, we were talking about VR, not AR. AR h

                • by Shaitan ( 22585 )

                  "I mean, it's OK for you to like it, but don't go telling other people what they should like and how they should spend their time."

                  I'll recommend and advocate whatever technologies to people I like. You don't get a vote. But that is rather comical advice in a discussion thread which begins with your unsolicited criticism of what I like. Also you are pretending to speak on behalf of the tens of millions (META alone has sold more than 20 million) with VR headsets when in reality VR just sucks to you.

                  As for re

                  • by noodler ( 724788 )

                    For content I can just glance at Steam and see 46 titles in New and Trending VR, all released since March of this year.

                    How does that compare to the 10000 games released yearly on steam?

                    To put it another way, there are about 30,000 monthly active users running VR games on steam.
                    There are 130,000,000 monthly active users that don't use VR on steam.

                    So after more than 10 years VR has managed to corner about 0.2% of the gaming market on steam. And that is the accessible platform with multiple choices in headsets and whatnot.

                    And don't forget, about 2 million steam users OWN a VR set. They're just not using them. So only 1.5% of t

                    • by Shaitan ( 22585 )

                      "How does that compare to the 10000 games released yearly on steam?"

                      Is there some reason it needs to?

                      "So after more than 10 years VR has managed to corner about 0.2% of the gaming market on steam."

                      After more than 10 years? The quest 2 release which signaled the START of VR as a potentially viable general consumer technology happened in Oct 2020. It has been two and a half years, and it was more or less released as a promise which wasn't significantly fulfilled until updates rolled in over the next year and a hal

                    • by noodler ( 724788 )

                      After more than 10 years? The quest 2 release which signaled the START of VR as a potentially viable general consumer technology happened in Oct 2020.

                      Clearly you're being selective here. You don't even mention the quest 1, which facebook dropped like a rock and left many people jaded.
                      Anyway, the current VR hump, including quest 2/pro/3, started with Palmer Luckey doing his kickstarter over 11 years ago. Everything you see today is part of that hump.
                      You're actually trying to pull a 'no true Scotsman'. 'No, no, this time it's the REAL VR. This time, it will be better.'
                      Dream on.

                      For a technology that a sane person would expect to take at least 15 years to phase into mainstream like other similarly revolutionary technologies that is actually fantastic.

                      Blah blah, more excuses, more blah.
                      You just have to look at the adoption rate to see

    • Oh come on.

      Some of us have been using "1440p" for over a decade. 2560 x 1440 resolution was "state of the art" when the Apple LED Cinema Display came out in 2010. Since then we've seen ultra-wide displays launch and become popular that are 3440 x 1440, as well as 4k / 5k resolutions. Laptops that aren't low-rent trash will usually be at this resolution as the cheaper option to an available 4k panel.

      Yes, there is still a lot of 1080p out there, but you don't exactly have to break the bank to find a displa

  • I've been rocking a 1080ti with 12GB RAM for almost 5 years on my 8K display (Dell Ultrasharp 43" 3840 x 2160) and haven't had a game balk yet. You can find those for $200 on eBay. NVidia shot themselves in the foot with that card, in my opinion.
    • A 1080ti isn't a good bargain at $200 anymore. For $80 more I could get a brand new RTX 3050. It might be a little slower and have only 8GB of RAM, but it also has ray tracing capability and uses half the electricity.

      • Barely being able to do ray tracing isn't a compelling argument, and having less VRAM is a total non-starter as the amount needed is going up, not down. The electricity would be relevant for mining, but not too significant for most other uses. Nobody should buy a xx50 nvidia card.

        I know I'm repeating another comment I made elsewhere, but a 6 year old card with a 250W TDP and the heat wear that comes with it just isn't worth it. Any 6 year old device with an MSRP of $700 going for $200 after all this time is just not that good of a deal. Finally, at least, the NVIDIA pricing/availability problem is better than it was. If you care about VRAM, then you can already get a used 3060 ti for about the price of a new 3050. And it has 12GB and still a lower TDP than the 1080 Ti.

          I ju

          • I know I'm repeating another comment I made elsewhere, but a 6 year old card with a 250W TDP and the heat wear that comes with it just isn't worth it.

            Most of the time the card has plenty of life left if you just refresh the thermal paste.

            All I care about is having no issues running 10 year old games at decent settings and being future proof for a bit.

            Then getting a card with less than 12GB is a very bad idea, not for the first reason, but for the second.

            • Plenty of life, sure, but it's not a real argument for buying something that old when something nearly new is less than 50% more. And driver updates are likely to continue into Windows 12 among other age related issues.

      • >Slower
        >Less RAM
        >b-but raytracing!
        Whew the propaganda is strong with this one.
        Nah, the shittier card that costs more money isn't somehow better, nice try Jensen but I'm not buying it.
        • I wouldn't be buying a $700 card used for $200 when I can buy new for a little more. Especially one that requires a new power supply at 250W. I don't care about ray tracing myself. I also don't care about upgrading.

          I'm sure I'll be able to find this newer card used for under $200 before long. If I was going to buy used it would probably be a 3060 ti for barely more than that 1080 ti. Even that is available for around the price of a new 3050 and with 12GB of RAM, less wear, and a lower TDP than a 1080 t

      • for $150. That's $130 cheaper than a 3050 and hangs with it quite nicely.

        Old flagship cards like the 1080 ti always command a premium because there's always somebody out there that wants to build their ultimate PC from x years ago. The Titan still sells for a pretty penny even though my rx580 outperforms it.
    • by Junta ( 36770 )

      The 1080Ti is going to fail harder at rendering than this 4060ti. The key thing is the definition of 'fails to push the bar'. The 4060Ti is likely to compare favorably to a 1080Ti at rendering even 4k.

      The benchmarks show even the current 8GB configuration putting in about 40 fps average for what are probably fully turned up graphics quality settings. A lot of review sites will nowadays rant on how anything under 60 fps on lowest 0.1% performance on max settings in 2023 games is just an unbearable nightmare.

      • The 1080Ti is going to fail harder at rendering than this 4060ti. The key thing is definition of 'fails to push the bar'. The 4060Ti is likely to compare favorably to a 1080Ti at rendering even 4k.

        Well it's going to be slower than 4060Ti but also cheaper (assuming $200 is correct). The 1080Ti is around 3060 level, which is what the 4060 is gonna be like, by the looks of it.

        https://www.tomshardware.com/r... [tomshardware.com]

        AMD's not very competitive and Intel's stuff is still a bit broken so it seems that there's no actually good option now and everything still sucks.

    • 1. 8k is not 3840 x 2160. That's 4k / UHD.
      2. a 1080ti is not capable of running even 3 year old titles at 4k with everything turned on and giving you a framerate above ~30fps unless you are playing a game that just doesn't use the GPU very much. Try playing something like Cyberpunk 2077 with everything turned on except ray tracing and DLSS (since your card doesn't support those) and you'll see what I mean.

      So basically everything you said is false or ambiguous.

  • "What this country needs is a good five-cent sports car."
  • by Petersko ( 564140 ) on Wednesday May 24, 2023 @10:08AM (#63547979)

    I don't get it in that if the game you're playing has any level of intensity, you're too busy trying to not get killed, avoid steering off a cliff, or manage your peons to notice the photorealistic ripples on a lake that's briefly in your view.

    I get it in that the technical hunt for peak performance is intrinsically satisfying. Even if you subtract out the "end state" of actually playing a game at top settings, the chase is still fun. Messing with water coolers, tweaking dram settings, pushing the limits... for a lot of people that's the fun.

    But for any action at all, even on a really big monitor the difference in your gameplay experience just doesn't seem significant to me.

    • 1080p ought to be enough for everyone.... No, I mean it!
      • I like to play splitscreen games with my son sitting near the TV and 1080p looks like crap.
        • So... did you have fun? Sounds like quality time. Or did you just complain about the resolution to him?
          Have fond memories of playing micro machines split screen on a 640x480 resolution.
          • Yes, it is fun. We've played a lot of Rocket League and my 1080ti plays those simple graphics at 4k surprisingly well. Or, F1 2020 (I think that's the year... they dropped splitscreen in more recent ones) can play 1440p if you turn down things a little.
        • Nobody's saying you're stuck with 1080p. But if you turn the graphics from Ultrahigh down to High (or... *gasp* medium!), you probably won't notice the difference unless you freeze frame and go looking for it.

    • I remember thinking this about playing at 1024x768 vs 800x600. There's much room for improvement between 1080p and retina display.
    • But for any action at all, even on a really big monitor the difference in your gameplay experience just doesn't seem significant to me.

      I imagine many people would agree with you. On a conscious level, none of the environment matters to the actual gameplay.

      So then why are games pushing towards photo-realism?

      Because while you may not consciously be aware of the beautiful scenery, the rest of your brain absolutely is aware of it and it feels different... dare I say 'better'?

      Street Fighter. Fun game for some. Very simple... and yet they made dramatic backgrounds for you to fight in. Why?

      Just because you are not consciously aware of the effect,

  • by pavon ( 30274 ) on Wednesday May 24, 2023 @10:46AM (#63548069)

    Sure, the RX 7600 is basically on par with the RX 6650 XT, and doesn't really push the midrange significantly forward. But it is launching at an MSRP that is $130 less than the RX 6650 XT launch price. Attracting customers that skipped the last two generations due to price is more important to the video card manufacturers than providing an increase in capability for those who purchased a card recently, especially in the mid-range.

    • A simpler way to look at that: this is a low-end card for a mid-range price. Of course it isn't going to be able to do anything fancy, you don't expect that from low-end cards. We haven't gotten to the point where GPU prices have returned to sanity, and we probably won't be for some years to come. This might be a good time to rediscover console gaming.

      Or not. Developers know that people aren't upgrading, so most games haven't really been pushing the hardware very much anyway.
  • by Berkyjay ( 1225604 ) on Wednesday May 24, 2023 @10:51AM (#63548087)

    What the hell does that even mean?

  • by lpq ( 583377 ) on Wednesday May 24, 2023 @10:59AM (#63548105) Homepage Journal

    Sadly, I found that a brand new (2023 model) Samsung "smart" display wouldn't support a 1440p signal. My old Samsung monitor auto-sized it to a compatible resolution, but trying to display 1440p content causes the new display to continually reset. I'd even asked on Amazon's questions whether it supported auto-resizing of lower display resolutions and was told it would support upscaling of lower resolutions to 2160p. It supports upscaling 720, 1080, 640 and 960, but not 1440. Very annoying.

    When I contacted Samsung support, I was told 1440p was not a standard HD display size, only 720p, 1080p and 2160p, and that they would not be fixing the problem.

    Very disappointing.

    • To be fair to the entire industry, pointing to a Samsung product for not supporting something is like going to a school of special needs children and using them to judge the general intelligence of society. I have many Samsung products. What I don't have is a Samsung product without some infuriating stupid bug, stupid support decision, stupid excuse for why their product suddenly stopped doing something with a firmware update, etc, etc.

      Samsung is a garbage company when it comes to software or firmware, and

  • It is _gaming_, not CAD or editing small text (neither of which needs the FPS that gaming needs). FullHD is entirely enough.

    • It's always best to play a game at your monitor's native resolution, or at one quarter of that. If you get a high resolution monitor for productivity purposes then you might want to be able to play your games at that resolution as well.

      Though I believe that there have been some significant advances when it comes to scaling, so maybe this isn't as true as it used to be.
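      The "native or one quarter of it" advice boils down to integer scale factors: a quarter of the pixels means exactly half the width and height, so every rendered pixel maps onto a clean 2x2 block of the panel. A small illustrative check (my own sketch, not anything a driver actually runs):

```python
# A render resolution upscales cleanly only if the native width and height are
# both the same integer multiple of it (e.g. 1080p -> 4K is an exact 2x2 mapping).
def scales_cleanly(render, native):
    rw, rh = render
    nw, nh = native
    return nw % rw == 0 and nh % rh == 0 and nw // rw == nh // rh

native_4k = (3840, 2160)
for name, res in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)), ("720p", (1280, 720))]:
    ok = scales_cleanly(res, native_4k)
    print(f"{name} on a 4K panel: {'integer scale' if ok else 'non-integer scale (interpolation needed)'}")
# 1080p and 720p map to whole pixel blocks on a 4K panel; 1440p does not.
```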
  • by Babel-17 ( 1087541 ) on Wednesday May 24, 2023 @11:11AM (#63548139)

    nVidia took a chance with its last generation cards, but everything worked out well for them: they got a good deal from Samsung for its less than cutting edge foundry, and managed to work with Samsung to strike a nice balance of good yields, good performance, and good power efficiency, even as they had already gotten a sweet price per wafer.

    That enabled nVidia to sell its chips to its partners for a reasonable price, and at what would have been a good deal for consumers had mining not been a thing. The $700 MSRP for the RTX 3080 was widely taken with a grain of salt, and it was expected that the AIB partners would sell their versions starting at $800. I guess they did, but I gave up trying to score one, and eventually got mine as part of a build from Cyberpower.

    Mining isn't a factor anymore, but people are forgetting that the world has been awash in currency since the COVID-19 pandemic, and that has lowered the value of currencies through inflation. The process node nVidia is using is expensive, and while we are benefiting from a much better frames per watt ratio (my undervolted RTX 4080 uses less power than my undervolted RTX 3080 did), we are paying a steep premium for a much in-demand node having been used.

    AMD saved a few bucks by combining two less in demand nodes, but even there they weren't cheap, and they too have had to adjust for inflation.

    Well, "better days are coming" imo, and my guess is that the next generation will be much more about affordability. It will still be a gaming industry dominated by the presence of the Play Station 5/Xbox Series X generation, and as 8k monitors and TVs aren't expected to be much of a thing for several more years, the bar that nVidia and AMD will have to meet to satisfy gamers looking to play their console ports on 4k (or lower) panels, will be relatively easy to get to, even with mid-range video cards.

  • by Artem S. Tashkinov ( 764309 ) on Wednesday May 24, 2023 @11:20AM (#63548167) Homepage

    Why is yesterday's 1440p card suddenly a 1080p one for Nvidia?

    1. Monopoly/duopoly
    2. Greed
    3. Gimmicks (DLSS)
    4. "It's good enough people won't realize we're charging top dollar for crap". 8GB of VRAM is horribly insufficient for a number of modern games at 1080p - fast forward a few years from now - people will have to dial down texture quality settings just not to get horrible stuttering and abysmal framerates in low single digits.

    • It's good enough people won't realize we're charging top dollar for crap

      They aren't. They are charging bottom dollar. It's the "budget" card. No one is buying this for their fancy 1440p gaming rig. In reality it's the opposite: people are spending much more on higher end graphics cards for their 1080p gaming rigs.

      NVIDIA was wrong to market the previous lower end card to the 1440p crowd in the first place. It was never a good idea, and one that didn't drive sales.

  • Tone deaf (Score:4, Insightful)

    by CAIMLAS ( 41445 ) on Wednesday May 24, 2023 @11:32AM (#63548195)

    I don't get it - this review is tone deaf.

    They release discount cards - some of the first we've seen in a number of years - and they're available - also a bit of a new thing.

    Meanwhile, people buying discount cards don't typically have 1440p monitors - they're kind of an odd resolution, IMO. I'm only buying something that small, at such a high resolution, if I'm only using the screen for gaming. That's not a "budget" decision.

    • All(?) TV's are 4k, and 1440p gaming on a 4k screen looks very good (much better than 1080p, even though that's an even multiple of 4k res).
      • by CAIMLAS ( 41445 )

        WQHD/1440p is a decidedly "workstation" resolution, though. They aren't common monitors, and they have not traditionally been good for gaming - scaling 1080p to them looks weird, or cuts the edges off.

        I don't have a 4K TV, myself, so I can't comment there. But 16:9 ratio stuff has been the standard for gaming for so long, that most game textures, etc. look off on a wide format display.

        • by EvilSS ( 557649 )

          WQHD/1440p is a decidedly "workstation" resolution, though. They aren't common monitors, and they have not traditionally been good for gaming

          1440p is a very popular resolution for gaming monitors. It's considered the current sweet spot and 1440p monitors are plentiful.

        • 1440p has been a thing for at least 13 years, and basically every game title worth playing out there supports that as a native resolution, and has for quite some time.

          There aren't "workstation" resolutions. And 1440p has been a common display format since Apple shipped the LED Cinema Display in 2010, and Dell / HP / Lenovo / Samsung / LG started using the same panels in their 27" displays until 4K came along and hit price points that aren't stupid.

    • by Junta ( 36770 )

      Well, more to the point, the 'This isn't a 1440p card, this is a 1080p card!' is kind of a silly arbitrary statement.

      It can render 4k or 720p or 1080p or 1440p. For some level of detail it will deliver some level of performance at each resolution. From some reviews, it looks like it can run a fair selection of games at 'max' (relative to each game) at 40-60 fps even at 4k.

      It can do 4k better than a '4k' GPU could do 4k 5 years ago.

      I think most of the grumbling is the wide gap between the 4060TI (which does

  • Runs at bargain bin resolution

  • Sure the resolution is better, but what actual advantage does that give me? I've watched a few videos, looked at some comparisons, and it seems like the vast, vast majority of the time the visual difference between 1080p and 1440p is unnoticeable. Hell, the visual difference between 1080p and 4k is barely noticeable most of the time! Yes, there's a great deal more data, and it can be worth recording in 4k, but actually watching something? If you're gaming on a TV, or using a 32" monitor, I cou

    • It depends on what you're playing, what settings you're using, and how much practical draw distance you have.

      Some 25 years ago I had a Cornerstone CRT with a .22 pitch and some kind of then-silly-high resolution, and if I turned down my settings enough to play games on it then I could literally see (and shoot) people at greater distances than those at which they could see me...

  • When HDTV came out, we had 720p and 1080p resolutions (on broadcast, it was only 1080i, but the TVs could mostly handle a 1080p input signal (eg. from a Blu-Ray player or PC)). This meant that not only did 1080p content have to be downscaled to be viewed on a 720p TV, but 720p content had to be upscaled to be viewed on a 1080p TV. Because 720 is not a factor of 1080, it meant that each pixel of a 1080p display does not correspond to an integral number of pixels in a 720p display, so some blurriness might oc
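    To put numbers on the "not a factor" point: 1080/720 = 1.5, so most destination lines fall between two source lines and have to be interpolated, which is where the blur comes from. A tiny illustrative sketch:

```python
# Mapping 720 source lines onto 1080 destination lines: the 1.5x factor means
# two out of every three destination lines fall between two source lines and
# must be interpolated. With an integer factor (1080p -> 2160p is exactly 2x),
# every destination line lands on a source line.
src_lines, dst_lines = 720, 1080
scale = dst_lines / src_lines          # 1.5 -- not an integer

for dst in range(6):                   # first few destination lines
    src_pos = dst / scale
    kind = "exact" if src_pos.is_integer() else "interpolated"
    print(f"dest line {dst} <- source {src_pos:.2f} ({kind})")
```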

  • If it ain't broke don't fix it - in that sense, I wish Nvidia would just do a die shrink of the GTX 1080 to achieve half the original TDP, but keep or improve the performance of the original, and then sell it for a good price.

    Regarding 2560x1440, it's actually a great resolution for gaming at high frame rate and with a lot of details. First, because there are fast monitors available, such as Samsung Odyssey G6 or G7 VA at 240Hz, LG 27GR95QE-B OLED at 240Hz, or Asus PG27AQN IPS at 360Hz. Second, many slightl
