AMD Intel Hardware

Intel and AMD May Both Delay Next-Generation CPUs

MojoKid writes "AMD and Intel are both preparing to launch new CPU architectures between now and the end of the year, but rumors have surfaced that suggest the two companies may delay their product introductions, albeit for different reasons. Various unnamed PC manufacturers have apparently reported that Intel may push back the introduction of its Ivy Bridge processor from the end of 2011 to late Q1/early Q2 2012. Meanwhile, on the other side of the CPU pasture, there are rumors that AMD's Bulldozer might slip once again. Apparently AMD hasn't officially confirmed that it shipped its upcoming server-class Bulldozer products for revenue during August. This is possible, but seems somewhat unlikely. The CPU's anticipated launch date is close enough that the company should already know if it can launch the product."
This discussion has been archived. No new comments can be posted.

  • Collusion (Score:4, Interesting)

    by parlancex ( 1322105 ) on Saturday September 03, 2011 @05:41PM (#37298856)
    There might be good reasons on both sides, but the tinfoil hatter in me believes this might have more to do with the fact that both companies want to see a little more profit out of the R&D that went into the current generation of products before obsoleting them. The performance of the current generation is high enough that it is getting harder to introduce a new generation at a price point that could both recover R&D and provide reasonable value for the customer.
    • Ya right (Score:4, Interesting)

      by Sycraft-fu ( 314770 ) on Saturday September 03, 2011 @05:54PM (#37298930)

      The current situation is Intel is slaughtering AMD. AMD hasn't had an architecture update in a long, long time and it is hurting them. Clock for clock their current architecture is a bit behind the Core 2 series, which is now two full generations out of date. Their 6 core CPU does not keep up with Intel's 4 core i7-900 series CPU, even on apps that can actually use all 6 cores (which are rare). Then you take the i5/7-2000 series (Sandy Bridge) which are a good bit faster per clock than the old ones and there is just no comparison.

      On top of that, Intel is a node ahead in terms of fabrication. All Sandy Bridge chips, and many older ones, are on 32nm. AMD is 45nm at best currently. Not only does that equal more performance but it equals lower heat for the performance, particularly for laptops. Then of course Intel is talking about Ivy Bridge, which is 22nm, another node ahead. Their 22nm plant is working and they've demonstrated test silicon so it will happen fairly soon.

      The situation is not good for AMD. All they've got is the low end, and that is getting squeezed hard by Intel too. They need a more efficient CPU and they need it badly. Delaying is not something they want to do; Bulldozer has been fraught with delays as it is. They've been talking about it for a long time, since 2009 or so, and delivered nothing.

      They have every reason to want to get Bulldozer out as soon as possible and preferably before Ivy Bridge. Each generation that Intel releases that they don't have a response for just puts Intel that much farther ahead.

      Now that said, Intel may well have decided to hold Ivy Bridge if AMD can't deliver Bulldozer because they don't need to. Sandy Bridge CPUs are just amazing performers, they don't need anything better on the market right now. However I can't imagine AMD colluding with Intel on this. They are not in a good situation.

      • Comment removed based on user account deletion
        • I think the Pentium 4 days really f'd over AMD in the OEM space; since the Core/Core 2, Intel's held the crown, though for most uses AMD's E-350 is a really nice offering. I think Bulldozer will have advantages in the low-to-mid end, where Intel may keep the raw CPU speed crown. I'm more interested in seeing an Nvidia Tegra 3 in laptop/desktop options myself. I think we've gotten so used to the bigger, better cycle that we've lost sight that 5+ year old tech is more than fast enough for anything most pe
          • I have to say that so far I'm impressed with the E-350 in my new not quite netbook. For where it sits with price/performance/battery life there weren't any portables with Intel solutions I could consider. I don't have the patience to put up with the Atom anymore and their larger budget processors are assembled in an indecipherable mess of product lines and model numbers.

            Intel's problem that everyone has been sounding the alarm on is that in the coming years being the x86 people with court mandated competiti
      • Re:Ya right (Score:4, Insightful)

        by laffer1 ( 701823 ) <luke@@@foolishgames...com> on Saturday September 03, 2011 @06:12PM (#37299038) Homepage Journal

        I don't believe the bit about 32nm is accurate. I just ordered a new laptop with an AMD A6-3400 CPU. This is a Fusion-based chip and is 32nm.

        As far as the performance claims regarding 6-core AMD chips, I have to agree with that. However, the cost of an Intel chip is not worth it. My 6-core AMD upgrade saved me hundreds of dollars. It still doubled my Starcraft 2 framerate over my Phenom 9600 X4.

        Intel stuff is faster if you have the money. It's not fanboyism, just practical price/performance based on benchmarks.

        • I bought a nice 6-core Phenom II X6 1035T. It is underclocked to only 2.6 GHz, but for $450 I got 8GB of RAM and hardware virtualization instructions to run VMware. With the 6 cores and 8GB of RAM it rocks to have 3-4 VMs running for the price I paid.

          With the ATI 5750 that came with it, games run reasonably well too. As soon as I upgrade the PSU I plan to flash the BIOS so I can clock my CPU to 3.2 GHz. Asus crippled it, but there are hacks to get around it.

          For value, the AMD Phenom II is only 4-7% slo

        • Intel core-i5-2310 Sandy Bridge - $190.
          AMD Phenom II X6 1090T Black - $170.

          That's a $20 difference and BTW the i5 blows away the Phenom (any Phenom). You don't even need an i7.

          Intel is able to price their CPUs at a bit of a premium over AMD, which is why Intel is rolling in money and AMD is not. But there's a good reason why Intel has that pricing power, and it's one word: "Sandy Bridge".

          It is also true that the absolute highest-end unlocked Intel CPU is priced at a very serious premium... but if you are try

          • You also have to consider a few other things: integrated graphics options, especially in laptops, and/or motherboard options. This is where the pricing really favors AMD. For my mom, father, brother, sister, grandparents, and others, an E-350-based system is sufficient, and the other integrated GPU options from AMD all exceed Intel's. This is why I've gone about half and half on even recent builds, often in favor of AMD. Total system cost for a non-suck system (sub-$750) really favors AMD. I think where AMD needs to f
            • by Shillo ( 64681 )

              I've recently switched from all-Nvidia to AMD GPUs and was pleasantly surprised when the drivers horror story just didn't happen. Apparently the monthly release cycle did wonders for them, both on Windows and Linux.

              • It's gotten a lot better; 3D acceleration for the E-350 still took a couple of months, for instance... I don't run Linux as my host OS except on my server, so it's rarely an issue.
          • by laffer1 ( 701823 )

            Let me be clear, when I was talking about price, I was including the fact that I upgraded an existing system with a 6 core CPU rather than buying new RAM + motherboard + CPU to switch to an Intel chip.

            There are workloads where AMD chips beat Intel chips. The benchmark mentioned by the other poster is an example. One benchmark does not prove anything, but I'm certainly happy with my purchase.

            What I like most about the AMD CPU is that it's great for building packages. I use it occasionally on the MidnightBS

        • for something more useful than Starcraft. I agree you don't need 6 cores and 4GB of RAM to read your e-mail, but today the workload of the average server or desktop includes running virtual machines, virus scanners, full encryption, Flash websites and whatnot. The laptop I was "given" 2 months ago has a brand new 4-core Intel, 4GB of RAM, and Nvidia Quadro graphics, and it's too slow to run my normal workload of terminals, browser and VMs. Given the fact that I'm a contractor, spending a little extra on a faster CPU would proba
          • For servers, almost everything is running on VMs now. More power per CPU is very welcome there, since you can run faster/more VMs per box. The less heat you produce, the more servers you can put in a data center. Given the cost for real estate at prime interconnect sites, it's profitable to go green, even if you're not a tree hugging hippie.

            Hm, actually, if you're going to have a pile of heavy-duty VMs running concurrently, a higher number of slightly less powerful cores are going to be much better than a

          • by Shillo ( 64681 )

            for something more useful than Starcraft

            If you think you need more CPU for desktop than for gaming, you are doing something seriously wrong, no matter how many VMs you use. Seriously, check the hardware requirements for, say, Starcraft 2. It totally owned my machine before I last upgraded. The same machine that practically flies for development, VMs and computational fluid dynamics. Yes, it uses on-access virus scanning and W7.

            4GB of RAM

            There's your problem. VMs are big. Swapping to hard disk is slow. More CPU won't help. You need more RAM.

            Either that or y

            • by Targon ( 17348 )

              It depends on the games you are looking at. Most of the first-person shooters are highly GPU-focused, with very little AI involved. Much of this is due to the practice of releasing the same game for consoles as well as PCs: developers go for the lowest common denominator, which generally means a low-end CPU, and even the graphics tend to avoid being cutting edge so the PC version is almost identical to the console version.

          • by smash ( 1351 )

            If you're running numerous VMs, RAM is your problem. Given enough RAM, a Core 2 Duo will run several VMs in a test environment just fine. With 4GB of RAM, you can throw the fastest CPU at it you like; if you start running into swap, you're fucked.

            In fact, that goes for almost every non-gaming task you'd use a box for these days, other than transcoding video and a few other CPU bound niche tasks.

            • by smash ( 1351 )
              Oh, and in the server space for VMs, RAM is still the limiting factor. My little work cluster has 128GB of RAM with 32 Xeon cores. CPU is probably running at 15-30%; RAM, however, is getting tight.
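              As an aside, here is a minimal sketch of that kind of "is it RAM or CPU?" check, assuming a Linux host where /proc/meminfo and /proc/loadavg are readable; the thresholds are arbitrary illustrations, not recommendations.

              ```python
              """Rough check of whether RAM or CPU is the scarce resource on a Linux box.

              Illustrative sketch only: it reads /proc/meminfo and /proc/loadavg, which
              exist on any stock Linux kernel. The cutoffs below are made-up examples.
              """
              import os


              def meminfo_kb():
                  """Return /proc/meminfo fields as a dict of kB values."""
                  info = {}
                  with open("/proc/meminfo") as f:
                      for line in f:
                          key, rest = line.split(":", 1)
                          info[key] = int(rest.strip().split()[0])  # values are in kB
                  return info


              m = meminfo_kb()
              total = m["MemTotal"]
              # MemAvailable exists on kernels >= 3.14; fall back to MemFree otherwise.
              available = m.get("MemAvailable", m["MemFree"])
              swap_used = m["SwapTotal"] - m["SwapFree"]
              load1 = float(open("/proc/loadavg").read().split()[0])
              cores = os.cpu_count() or 1

              print(f"RAM used: {100 * (total - available) / total:.0f}%  "
                    f"swap used: {swap_used // 1024} MB  load/core: {load1 / cores:.2f}")

              if swap_used > 0 or available < total * 0.1:
                  print("Memory looks like the limiting factor -- add RAM before CPU.")
              elif load1 / cores > 0.8:
                  print("CPU looks busy; more or faster cores may actually help.")
              else:
                  print("Neither RAM nor CPU is obviously saturated.")
              ```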
        • Comment removed (Score:4, Informative)

          by account_deleted ( 4530225 ) on Sunday September 04, 2011 @11:28AM (#37302852)
          Comment removed based on user account deletion
          • Intel's compiler is hard to beat, and so a great many people use it. The fair test would be the Intel compiler on Intel chips, and a compiler built to AMD's specifications on AMD's chips.

            • by smash ( 1351 )
              Actually the "fair" comparison would be to use the compiler most third party software out there uses.
      • by malkavian ( 9512 )

        People don't really care about clock-for-clock these days. The kicker is in energy efficiency and cost for performance.
        AMD do reasonably well in the energy efficiency and really well in the "Bang for Bucks" department. Yep, Intel currently outstrip them on the high end, but AMD have a lot of the mid-to-low-range market, and still have a good showing in the server market. I wouldn't exactly call that 'getting slaughtered'.
        Still, as you say, they do need to get newer architectures out the door to keep bei

        • by 0123456 ( 636235 )

          AMD do reasonably well in the energy efficiency

          Where? Every benchmark I've seen puts AMD well behind the i3 and i5 in performance/power and the idle consumption of the i3 and i5 isn't much worse than an Atom.

          • To an extent, yes, but the Atom sucks; seriously, Intel ought to have been too embarrassed to let that dog see the light of day. And yes, the Intel offerings do offer better battery life, but at a cost: the only ones I looked at were several hundred dollars more. Battery life is great, but with that much extra on the price tag you might as well just buy a couple of extra batteries.

            • Well, for sure I wouldn't have put the founder's name on the Moorestown product. That's just asking for trouble.
            • The Atom was a quick and dirty way for Intel to get into the tablet and netbook business as that market was taking off. Its only reason for existence is that it is an x86 processor, and thus can run Windows. Expect to see sales drop off rapidly when Windows 8 comes out with ARM support.
              • by HiThere ( 15173 )

                The trouble with MSWindows 8 on ARM (if you even consider MS an option) is software from other companies. It won't run. It won't even install. (That's a prediction, not tested. But a reasonable bet.)

                I don't expect MSWindows8 on ARM to have ANY effect on desktop sales, even at the extreme low end. It may be fine for phones, where there *aren't* any legacy problems. (Not the way I'd bet, but a possibility.) But I don't see any possibility for it on a general purpose computer. Going to run an X86 emula

                • by smash ( 1351 )
                  This is why Microsoft has been pushing the .NET virtual machine, I am guessing. JIT compilation + virtual machine = CPU-agnostic code = freedom from dependence on x86.
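                  As an illustration of that "portable bytecode plus a per-CPU runtime" idea, here is a tiny sketch using Python's standard dis module as a stand-in for .NET/CIL (Python is used only because it ships a convenient disassembler; the principle is the same).

                  ```python
                  """Illustration of why VM bytecode is CPU-agnostic.

                  Python stands in for .NET/CIL here purely as an example: dis (standard
                  library) disassembles a function into stack-machine bytecode that looks
                  the same on x86, ARM, or anything else the interpreter runs on; only the
                  VM/JIT itself is architecture-specific.
                  """
                  import dis
                  import platform


                  def add(a, b):
                      return a + b


                  print("Running on:", platform.machine())  # e.g. x86_64 or aarch64
                  dis.dis(add)  # identical bytecode listing regardless of the host CPU
                  ```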
      • by IorDMUX ( 870522 )
        Well, the article basically said that there is no reason to believe that Bulldozer is delayed at all. I dunno why the title reads "Intel and AMD may both delay"

        ... wait. Yes, I do. To get readers.

        From the article:

        The CPU's anticipated launch date is already close enough that the company should already know if it can launch the product or not; waiting until now to announce a delay isn't something Wall Street would take kindly. Moreover, AMD has been fairly transparent about its launch dates and delays ever since the badly botched launch of the original K10-based Phenom processor back in 2007. Llano has been shipping for revenue for several months, and we're not aware of any 32nm production troubles at GlobalFoundries.

      • by Kjella ( 173770 )

        On top of that, Intel is a node ahead in terms of fabrication. All Sandy Bridge chips, and many older ones, are on 32nm. AMD is 45nm at best currently.

        No, the Llano chips are shipping and are 32nm SOI. However, only the low-power Bobcat cores are made on that process; they need the high-power Bulldozer cores to compete with Intel on performance. But yes, AMD still ships a great many 45nm chips.

      • The current situation is Intel is slaughtering AMD.

        Then why do I only buy AMD (and ARM) these days? Frankly, Intel just seems to fib about their power envelope every generation and I do not, repeat, do not like to be surrounded by noisy computers. Currently running a quietized 4 way Phenom II box, very happy with it. I have not been happy with any Intel box as a workstation for quite some time. Nothing beats the Pentium M in my aging Shuttle for a basically silent server (21 dB @ 3 meters). Every other Intel box I have run recently requires stupid amounts o

        • by 0123456 ( 636235 )

          Then why do I only buy AMD (and ARM) these days? Frankly, Intel just seems to fib about their power envelope every generation and I do not, repeat, do not like to be surrounded by noisy computers. Currently running a quietized 4 way Phenom II box, very happy with it.

          You'd have been happier with an i3 or i5. I can just hear the fans on my i5 server when I stand with my ears a few inches away from it.

          • by Rich0 ( 548339 )

            Not at the same price point. For what he'd spend on the i3/i5 he could probably have water cooling or something else that is ridiculous. That is the thing these kinds of comparisons always leave out: cost. I've been running AMD systems for a while now. I can upgrade a box every two years, when buying Intel would mean I'd be upgrading every four years. While in years 1-2 the Intel system would be somewhat faster, in years 3-4 the AMD system would be miles ahead. Plus I'm sinking less money into wh

          • You'd have been happier with an i3 or i5. I can just hear the fans on my i5 server when I stand with my ears a few inches away from it.

            Both Intel and AMD give the TDP of their four core parts (i5 for Intel, Phenom II for AMD) as 95 watts. The difference is that I believe AMD. And I wonder about your belief that an i5 box would be quieter than mine with similar components, or your definition of what a few inches is, or whether you have the stereo on when you post to Slashdot. I have had people tell me with great assurance that an Xbox 360 is quieter than a PS3, or that they can't hear their PS3 when watching a movie, both patently absurd wi

            • by 0123456 ( 636235 )

              Both Intel and AMD give the TDP of their four core parts (i5 for Intel, Phenom II for AMD) as 95 watts

              And I've never seen the power consumption of the i5-2400 go much above 50W in benchmarks. So perhaps Intel are lying; the i5 seems to use about half as much power as they claim it does.

              Perhaps rather than believing either company's numbers you should actually try measuring them?
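              For anyone who does want to measure rather than argue from spec sheets, here is a rough sketch that reads the CPU package energy counter exposed by Linux's intel_rapl powercap driver. It assumes that driver is loaded and that /sys/class/powercap/intel-rapl:0/energy_uj exists (RAPL appeared with Sandy Bridge), and it reports package power, not power at the wall, so it is a rough check rather than a substitute for a watt meter.

              ```python
              """Rough CPU package power measurement via Intel RAPL (Linux powercap sysfs).

              Assumptions: the intel_rapl driver is loaded and the standard
              /sys/class/powercap/intel-rapl:0/energy_uj counter is readable (often
              only by root). This is package power, not wall power.
              """
              import time

              ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"


              def read_uj():
                  with open(ENERGY) as f:
                      return int(f.read())


              def average_watts(seconds=5.0):
                  start = read_uj()
                  time.sleep(seconds)
                  end = read_uj()
                  # Counter is in microjoules; it wraps eventually, but a short sample
                  # like this one can safely ignore that.
                  return (end - start) / 1e6 / seconds


              print(f"Average package power: {average_watts():.1f} W")
              ```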

      • No offense, but I'm typing this on a $350 15" Acer laptop with an E-350 (Zacate Fusion processor). I played Portal 2, start to finish, on this thing at medium settings. It gets about 6 hours of battery life out of light web browsing. Intel may be killing AMD on the low end, but based on my comparison to a $600 HP ProBook with an i3-2k-something, it's indistinguishable at web browsing and word processing, and the i3 just fails any time you try to run a game.

        Not saying that the Sandy Bridge i3 isn't a bett

        • You have a point, but you are comparing a business notebook at the inflated suggested retail price (big companies get big discounts) to a consumer-grade machine. Try opening the laptop 2000 times and typing 1000 hours on it, and see which one is still more or less functional. A better comparison would be an HP Pavilion. You can get something like a DM4 for $450 on the HP website right now, which compares to your Acer in specs and also has the AMD chipset. The cheapest i3 Pavilion is another $100 more expensive. Th
        • Agreed, the E-350 is such a great value. I'm a bit sad that they're not selling closer to their release price points, but it really is a testament to how well they work.
      • Re:Ya right (Score:5, Informative)

        by sjames ( 1099 ) on Saturday September 03, 2011 @07:55PM (#37299620) Homepage Journal

        Actually, in Opteron vs. Xeon, AMD is doing quite well. Clock speed only gets you so far if you're bottlenecked on memory bandwidth.

        • Of course this gets modded up even though Xeons have had more memory bandwidth than Opterons for over 3 years....

          • by sjames ( 1099 )

            Opteron has 4 channels/CPU vs. Xeon's 3. Both are DDR3, same speed. Previous generations of Intel chipsets had problems achieving full speed on the memory bus in multi-dimm configurations.

            So what would make you say the Xeon has more memory bandwidth?

            • What are you talking about? Current Opterons have a 2-chip MCM module (see the AMD fanboys squirm over that after they denigrated Intel for doing it first). Each chip in the module has a 2-channel memory controller that isn't any different from the ones on the desktop Phenoms. Meanwhile, Intel has been selling single-chip solutions with 4 memory channels since 2009, with lower-end models having 3 channels (Intel also had a "real" 8-core CPU out before Bulldozer despite AMD's marketing claims that they inve

              • by sjames ( 1099 )

                The high-end Xeon chips had THREE memory channels according to Intel right up until the last batch released in Q2 of 2011; did they lie, or did you mean for over 3 weeks (rather than years)? Note that the Westmere runs in the mid-2 GHz speed range and at 130 watts.

                Personally, I don't care if there's one or two dies in the package for the 8- and 12-core chips (though if the workload is memory bound, the 12-core shouldn't be used); it doesn't seem to matter much since there will be 2 or 4 chips on the board anyway. F
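                For reference, the theoretical peak numbers behind this channel-count argument are easy to work out: peak bandwidth is channels × transfer rate × 8 bytes per 64-bit transfer. A quick sketch, assuming DDR3-1333 on both platforms purely for illustration:

                ```python
                # Theoretical peak memory bandwidth: channels * MT/s * 8 bytes per transfer.
                # DDR3-1333 is assumed here purely for illustration; real-world throughput
                # is lower and depends on the controller and how the DIMM slots are populated.
                def peak_gb_s(channels, mt_per_s=1333):
                    return channels * mt_per_s * 8 / 1000  # decimal GB/s

                print(f"3 channels of DDR3-1333: {peak_gb_s(3):.1f} GB/s")  # ~32.0
                print(f"4 channels of DDR3-1333: {peak_gb_s(4):.1f} GB/s")  # ~42.7
                ```

                Real-world throughput sits well below these peaks and depends heavily on the memory controller, which is why benchmarks rather than channel counts tend to settle this kind of argument.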

      • by LWATCDR ( 28044 )

        "The current situation is Intel is slaughtering AMD"
        Frankly, in the consumer space ARM is slaughtering Intel. The truth is that today 90% of all desktops have more than enough CPU power. The most intensive thing most computers do today is play back HD video. Sure, the i7 Sandy Bridge is blindingly fast, but most people don't need the speed or the price tag. The new A8s and i3s show where the future is going: fast enough, with good enough graphics, at a low price.

      • Basically you are right. AMD has nothing even remotely close to Sandy Bridge, and Bulldozer won't get them there either. I've been a long-time AMD fan, and over the years AMD has saved me bundles of money with their socket compatibility.

        But AMD has to make a socket switch now and there are way too few AM3+ mobos available. Not only that, but the mobos that are available are wired for compatibility: they will work with AM3+ CPUs, but they won't be able to make use of all the new performance capabilities. So

      • by dbIII ( 701233 )
        Even on the slower-moving Opteron development, it appears your information is stuck in June 2009.
      • by tlhIngan ( 30335 )

        Now that said, Intel may well have decided to hold Ivy Bridge if AMD can't deliver Bulldozer because they don't need to. Sandy Bridge CPUs are just amazing performers, they don't need anything better on the market right now. However I can't imagine AMD colluding with Intel on this. They are not in a good situation.

        Perhaps it's Intel wanting to keep AMD alive for anti-trust reasons. They could very well continue to slaughter AMD, but is it in the best interests of Intel? If AMD dies, then Intel's going to ge

      • 45 nm is correct for the AMD Phenom II series. But the "Llano" APUs for low-end desktops are already in 32 nm. So you could say Intel is half a step ahead right now. Overall, however, I agree that AMD is under pressure and cannot afford artificial delays in their products.

        What they still have are some niches where Intel has slacked off or does not compete for other reasons. The most important one right now is the APUs. AMD's Brazos platform does well in netbooks, and IMHO the Llano is a good choice for che

      • Re:Ya right (Score:4, Informative)

        by Rockoon ( 1252108 ) on Sunday September 04, 2011 @09:46AM (#37302392)

        The current situation is Intel is slaughtering AMD.

        Except where AMD is slaughtering Intel, of course.

        HP, Asus, MSI, and Lenovo have all adopted the E-350 over the Atom alternatives in notebooks and low-end laptops.

        Remember that notebooks and laptops are replacing desktops in the typical home. Intel is probably pretty worried that they have absolutely no competitor to the E-350 that doesn't both cost significantly more and draw significantly more power.

  • Gee, those delays mean the brand new shiny chips will just happen to come out with Windows 8. Coincidence?

    Not only can you finally ditch that aging Vista or XP machine for shiny Windows 8, but now you can have a shiny new CPU too!

    • Possibly, but realistically most people would do fine with AMD's Fusion-core processors, the ones they've already released. There are legitimate reasons to have more power, but for the things that people typically do, it's more than enough power.

  • by Verunks ( 1000826 ) on Saturday September 03, 2011 @06:14PM (#37299054)
    We have known that Ivy Bridge will be released in 2012 since April... http://www.maximumpc.com/files/u69/sandy_bridge-e_roadmap_updated.jpg [maximumpc.com]
  • by JoshuaZ ( 1134087 ) on Saturday September 03, 2011 @06:18PM (#37299080) Homepage

    The most naive question to ask is whether this sort of delay is relevant to Moore's law and similar patterns. There are a variety of different forms of Moore's law. We've seen an apparent slowdown in the increase in clock speed http://www.tomshardware.com/reviews/mother-cpu-charts-2005,1175.html [tomshardware.com]. The original version of Moore's Law was about the number of transistors on a single integrated circuit, and that's slowed down also. A lot of these metrics have slowed down.

    But this isn't an example of that phenomenon. This appears to be due more to the usual economic hiccups and the lack of desire to release new chips during an economic downturn (although TFA does note that this is a change from Intel's normal approach to recessions). This is not by itself a useful data point, so there is no further need to panic.

    On a related note, there's been a lot of improvement in the last few years simply from making algorithms more efficient. As was discussed on Slashdot last December http://science.slashdot.org/story/10/12/24/2327246/Progress-In-Algorithms-Beats-Moores-Law [slashdot.org], by a variety of benchmarks linear programming has become 40 million times more efficient in the last fifteen years, and only a factor of 1,000 or so is due to better machines, with a factor of about 40,000 attributable to better algorithms. So even if Moore's law is toast, the rate of effective progress is still very high. Overall, I'm not worried.
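    As a quick sanity check on those figures, using only the numbers quoted above:

    ```python
    # Sanity check on the quoted linear-programming speedup figures.
    total_speedup = 40_000_000   # ~40 million times faster overall over ~15 years
    hardware_factor = 1_000      # portion attributed to faster machines
    print(total_speedup / hardware_factor)  # 40000.0, matching the ~40,000x credited to algorithms
    ```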

    • by dbIII ( 701233 )

      most naive question to ask is whether this sort of delay is relevant to Moore's law

      Does it have to be? Moore doesn't work at Intel anymore, so they might have a new plan. It was going to hit a physical limit at some point anyway, which Moore would have known very well when he proposed it in the first place.

      • It was really Moore's observation... and a forward-looking view that it would continue to be a trend. All trends come to an end.
  • I will continue to buy AMD. I've compared my sub-$500 AMD rigs with comparable Intel rigs, and I don't see why I'd spend 2 to 4 times the money on Intel over AMD when AMD does a fine job. My Phenom II 945 has served me well: runs cool, runs fast, and takes everything I put on it like a champ. I have yet to stress out the Phenom. I've run multiple games on it, plus audio and video work. The only 'advanced' things I haven't done on it are CAD and SETI@home. Why spend 2 or 3 times as much for the Intel, when all I'
    • I don't buy Intel necessarily for the CPU; I buy Intel for the supporting chipsets. Intel chipsets in most instances are rock solid, with excellent driver support for both Windows and Linux. That being said, I'm glad AMD exists... if AMD didn't exist, it would be necessary for Intel to create one ;)

      -M@

  • I have to wonder how much of this is due to the stagnating economies across much of the developed world. My recollection is that the last time the economy went south, all sorts of projects were either postponed, put on hold, or simply ended.

  • Processor technology is at a point where more is gained by shrinking the die than by changing the design. Because of SMP, sophisticated design changes are not needed to gain performance. Intel knows they can seriously move ahead of AMD if the next processor is successfully reduced to 22nm, so they might as well wait. AMD isn't able to drop cash on reducing die size on every other release. If they can put off releasing their next processor until they can afford smaller-die production, they will get a lot more out of i
