
Intel Announces Price Cut for 9th Generation F and KF Processors (anandtech.com)

An anonymous reader shares a report: One of the interesting developments of Intel's 9th Generation Core processors for desktops, known as the S-series, was that the company decided to release versions of the hardware with the graphics disabled in order to use every chip from the wafer. At the time, Intel was criticised for its pricing: it was offering the same processor minus graphics for the same bulk unit cost, with no discount. Today Intel is adjusting its strategy and pricing these F and KF processors lower than before. Nearly every 9th Generation Core processor for the desktop has a corresponding graphics-free option: the Core i9-9900K has its Core i9-9900KF, and the Core i5-9500 has a Core i5-9500F. The only difference between the two parts is the disabled graphics, which means the user can't take advantage of Intel's QuickSync or drive a display from the processor; however, most of these processors end up in systems with discrete graphics cards anyway. At launch, Intel priced them identically to the parts that did have graphics, but retail outlets were ultimately selling the F and KF processors at a small discount. Intel's announcement today makes that price difference official.
  • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Monday October 07, 2019 @11:14AM (#59278700)
      Comment removed based on user account deletion
      • The only people who are still buying these are the people who have no choice, or are very bad at math, and can't read reviews either.

        So most of the market then?

        The average consumer still isn't very tech savvy, but they at least know a few of the specs they should be looking at. I still see some people get tripped up with memory because RAM and hard drive memory are synonymous to them.

        If you've ever wondered why Apple hides most of this information, it's not just because it doesn't always compare favorably to competing products as a raw number.

      • In my country the 9400F is only 2/3rds the price of the 9400, making it better value than a Ryzen 3600 in terms of performance per dollar.
    • by DRJlaw ( 946416 )

      take a quad core and it fails QC for two cores, then make it a dual core and still sell it. now a APU GPU CPU FPU minus the GPU. no biggee. love the price cuts. will take the CPU only any day all day for a deep deep discount.

      Given all the thermal and power management features of the chips, you receive all the same benefits -- powered-down GPU section -- simply by not hooking up a monitor to any of the onboard graphics outputs.

      So, will you take it all day for no discount? (Intel tray price)
      Will you take it

    • by nagora ( 177841 )

      Yep: the Fucked and Kinda Fucked lines.

    • by tlhIngan ( 30335 )

      this is nothing new.. take a quad core and it fails QC for two cores, then make it a dual core and still sell it. now a APU GPU CPU FPU minus the GPU. no biggee. love the price cuts. will take the CPU only any day all day for a deep deep discount.

      It's not even that - sometimes you're going to design a part to have several different use cases, so you build in the ability to disable things. Sometimes it's for binning (bad GPU - GPU less chip, or bad cache line, hence the Celeron processors, etc). Other times,
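The value argument running through this thread boils down to a simple ratio of benchmark score to price. A minimal sketch; the scores and prices below are made-up placeholders for illustration, not real review data:

```python
# Hypothetical performance-per-dollar comparison. All numbers here are
# invented placeholders, not actual benchmark results or street prices.

def perf_per_dollar(score: float, price: float) -> float:
    """Return a benchmark score normalized by price (points per dollar)."""
    return score / price

# Placeholder figures purely for illustration:
cpus = {
    "Core i5-9400F": {"score": 100.0, "price": 150.0},  # iGPU fused off
    "Core i5-9400":  {"score": 100.0, "price": 225.0},  # same chip, iGPU on
    "Ryzen 5 3600":  {"score": 120.0, "price": 200.0},
}

for name, d in cpus.items():
    print(f"{name}: {perf_per_dollar(d['score'], d['price']):.3f} points/$")
```

With identical performance and a 2/3 price, the F part's ratio is simply 3/2 that of the full part, which is the comparison the comment above is making.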

  • Intel needs to cut pricing to keep up with AMD!

    • Big ad campaigns and price cuts are both signs of failure.

      Intel is failing at CPU design, and also failing at process technology.

      Maybe they can get the Israelis to bail them out again with another superior design.

      • by Anonymous Coward
        Yes, Intel is *obviously* failing at CPU design. I mean it's not like the vast majority of the world uses their processors for both servers and desktops.
        • Yes, Intel is *obviously* failing at CPU design. I mean it's not like the vast majority of the world uses their processors for both servers and desktops.

          The vast majority of the world does stupid shit contrary to their own interests every day. So?

      • by jwhyche ( 6192 )

        Intel is not failing at CPU design, and neither is AMD. AMD's star is currently rising and Intel's is currently setting. But this is nothing new; I've seen it over and over for two decades. In a few months Intel will come out with a new design and their star will start to rise again. Rinse and repeat.

        • In a few months Intel will come out with a new design and their star will start to rise again.

          Intel has been rehashing the same architecture for ages. AMD has brought out three new architectures in the same period, and all of them offer better price:performance ratios than Intel in spite of traditionally having been at a disadvantage in process technology. Now that they have approximate parity of feature sizes, AMD is beating up on Intel.

          When was it again that Intel planned to have an architecture out that doesn't need performance-compromising microcode fixes to be almost as safe as AMD when it come

          • by jwhyche ( 6192 )

            Fanboys give me such headaches.

            AMD has not brought out three new architectures in the same period. What they have come out with is one new architecture and two refreshes in the same period. It is exactly what Intel has done with the Skylake architecture. A new architecture, as you are implying, takes years and billions of dollars in research. And let's be honest here: AMD is doing well, but financially they have nowhere near the deep pockets that Intel has. I doubt that AMD has the cash on hand to do 3

            • I assume he meant during the time Intel has been on Core (and its derivatives). Core dropped in 2006. During that time, yeah, AMD has introduced new architectures. They weren't any good (although I heard Clawhammer was secretly popular with streamers), but AMD was trying.

              BTW, AMD isn't sitting still. By the time Intel un-fucks itself (2020? 2021?), AMD will be on the Zen4 architecture, with 256-core server chips.
              • by jwhyche ( 6192 )

                Incorrect. The last new architecture from Intel was the Skylake rebuild in 2015, which is getting on five years old now and should be due to be replaced soon with a new core design. In 2020 I fully expect AMD to be coming out with a Zen4 REFRESH of the Zen architecture.

                Sure, the Epyc server processors are beasts: 64C/128T with 256 PCIe lanes at $7,800. Over 2,000 shekels cheaper than Intel's closest Xeon. Problem is, nobody is buying them. According to this article Epyc is roughly 3.2% of the server

  • by Retired ICS ( 6159680 ) on Monday October 07, 2019 @11:20AM (#59278746)

    This saves the added step of disabling the Intel Video in the computer BIOS.

  • While there obviously is a huge business/budget PC market that will benefit from having an integrated GPU, there is also a considerable market that does not care about it or want it, as it will never be used. Yields make such a big impact on Intel's bottom line that it is a wonder they've never bothered to make a second line that omits the GPU entirely, rather than merely disabling the GPU on defective units, so as to get a much higher unit count from every wafer. Actually get rid of the GPU from the die

    • by Holi ( 250190 )
      Because tooling for a completely separate SKU is incredibly expensive; it's not like you can quickly switch the kind of chip you're creating. It takes time and a lot of money to set up your fab to produce a specific piece of silicon.
  • I haven't been there in 10 years, so memory is fuzzy and things may have changed, but....

    When I was there, an entire family of chips would have the same silicon. The silicon had fuses that could be burned at the factory to disable different features permanently. If you bought the low-end version, you got the same silicon as the high-end version, except fuses were blown to permanently disable certain features.

    If memory serves, each silicon version had 8-32 fuses.
    • It's not a new idea with computers and computer parts. Hell, even back in ye olden days with mainframes they were largely the same inside the box and how much you paid determined how much was actually activated inside. Need more processors? Call up IBM and they send a tech out with a wirecutter to cut a couple of disabling connections and poof! more processors.

    • by Megane ( 129182 )

      Another example: the common "blue pill" $2 microcontroller board from China uses an STM32F103C8T6 chip. Officially this only has 64K of flash (that's what the "8" means), but in actuality they all have 128K of flash, it's just not guaranteed. But this part has been made for so long that it has probably had perfect yield for many years. My guess is they have a fuse that simply swaps the top and bottom half of flash if they find a defect in the first half, and then test the second half. It's called binning. [wikipedia.org]

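The fuse scheme described in this thread can be modeled as a bitmask of permanently blown disable bits, one per feature. A toy sketch; the bit assignments and feature names are invented for illustration and are not Intel's actual fuse map:

```python
# Toy model of factory-blown feature fuses: one die design, with fuse bits
# permanently disabling features to create different SKUs. Bit positions and
# feature names below are invented for illustration only.

FUSE_IGPU_DISABLED  = 1 << 0   # graphics fused off (an "F"-style part)
FUSE_HT_DISABLED    = 1 << 1   # hyper-threading fused off
FUSE_CACHE_HALVED   = 1 << 2   # half the cache disabled (Celeron-style binning)

def feature_enabled(fuses: int, disable_bit: int) -> bool:
    """A feature is available unless its disable fuse has been blown."""
    return (fuses & disable_bit) == 0

# Pretend this part had its iGPU fused off at the factory:
fuses = FUSE_IGPU_DISABLED
print("iGPU:", "on" if feature_enabled(fuses, FUSE_IGPU_DISABLED) else "off")
print("HT:  ", "on" if feature_enabled(fuses, FUSE_HT_DISABLED) else "off")
```

The point of the scheme, as the comment notes, is that one silicon design serves every SKU in the family; only the blown fuse pattern differs.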

  • Go, AMD.
    Real competition in action !
    Nice to see they survived the 'socket' compatibility disaster !
    Gonna' be interesting to see what Intel comes up with to deal with the Ryzen challenge.

  • ...ALL the problems with their flawed execution of prefetch logic.

    There are a dozen problems now known with EVERY available Intel processor, and when you add the fixes that attempt to make them secure, performance takes a 20-30% hit.

    The same flaws exist in every processor since 2011; it kinda shows how important fixing that is.

    And for the peeps who want to jump in on the "But AMD..." bandwagon, their processors are NOT susceptible to the most egregious fails, and the rest require machine acc
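The mitigation state being argued about above can be inspected on a Linux machine, where the kernel exposes per-vulnerability status as one-line files under /sys/devices/system/cpu/vulnerabilities/. A minimal sketch that classifies those status strings; the sample lines are hardcoded so it runs anywhere, and they are illustrative rather than measurements of any particular CPU:

```python
# The Linux kernel reports per-vulnerability mitigation status as one-line
# files under /sys/devices/system/cpu/vulnerabilities/. This sketch classifies
# such status strings; the samples below are hardcoded examples.

def classify(status: str) -> str:
    """Reduce a kernel vulnerability status line to a coarse category."""
    if status.startswith("Not affected"):
        return "not affected"
    if status.startswith("Mitigation:"):
        return "mitigated"
    if status.startswith("Vulnerable"):
        return "vulnerable"
    return "unknown"

# Example status lines in the kernel's reporting format:
samples = {
    "meltdown":   "Mitigation: PTI",
    "spectre_v2": "Mitigation: Retpolines",
    "mds":        "Vulnerable; SMT vulnerable",
    "l1tf":       "Not affected",
}

for name, status in samples.items():
    print(f"{name}: {classify(status)}")
```

On a real system you would read each file in that sysfs directory instead of using the hardcoded samples; whether the mitigations cost the 20-30% claimed above depends heavily on the workload.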
