
Intel To Buy Smartphone Chipmaker Infineon For $2B

sylverboss writes "Intel Corp., the world's largest chipmaker, is close to an agreement to buy Infineon Technologies AG's wireless business, three people with direct knowledge of the discussions said. When it comes to desktop, laptop and server chips, Intel's pretty much got a lock on the market, but everyone can see the writing on the wall: mobile chips and architectures are the future of computing thanks to the popularity of smartphones, and Intel doesn't have anything to offer in that regard. Don't know Infineon? You should: they are the guys who have supplied Apple with the iPhone's baseband chips since 2007."
This discussion has been archived. No new comments can be posted.

  • Re:Infineon? (Score:2, Informative)

    by ELCouz ( 1338259 ) on Sunday August 29, 2010 @05:09PM (#33410688)
    they make a lot of chips, including RAM. They have a big market share in memory... just look at their competitors: TI, Broadcom, STMicroelectronics, Marvell, Freescale, NXP, Renesas, International Rectifier, Fairchild Semiconductor, Semikron, Dynex Semiconductor. Yup, definitely an important company!
  • by DDDKKK ( 1088707 ) on Sunday August 29, 2010 @05:10PM (#33410692)
    Rambus does not produce RAM. They develop and license RAM-related technologies. Infineon is one of the licensees (as is Intel).
  • by klingens ( 147173 ) on Sunday August 29, 2010 @05:11PM (#33410698)

    No, it doesn't. Infineon hasn't made RAM in a long time. They sold off their RAM business in 2006, naming it "Qimonda". In January 2009 Qimonda declared bankruptcy.

  • Re:Atom? (Score:3, Informative)

    by klingens ( 147173 ) on Sunday August 29, 2010 @05:14PM (#33410712)

    The Z600, coming in 4Q10, is the first Atom that's supposed to go into smartphones.
    What Intel is buying are not general-purpose CPUs like the Atom. It's the high-frequency chips that talk to the mobile base stations. Think "modem chips for mobiles". The chips running applications on the phones are totally different ones.

  • by YesIAmAScript ( 886271 ) on Sunday August 29, 2010 @05:16PM (#33410724)

    Atom is maybe a 2 W chip at best.

    The ARM CPUs used in phones, by contrast, are under 0.5 W.

    In a device like a smartphone, you simply cannot find room to make the battery larger to make up for the extra power used. Not to mention the cost of the larger battery.

  • Re:Atom? (Score:5, Informative)

    by WrongSizeGlass ( 838941 ) on Sunday August 29, 2010 @05:18PM (#33410738)

    Intel's Atom chips are low power. They're not good for putting into smartphones?

    They may be, but these are baseband chips (EDGE, GSM, etc.), not the main CPUs.

    Are there some Infineon chips now used for only smartphones that will show up in netbooks?

    Not unless you want to hold your netbook up to the side of your head and use it to make a phone call ...

    Do they run Linux? Do they run x86 instructions?

    No.

    And if not, will Intel sustain a product line that splits its main CPU culture away from x86?

    Not everything Intel produces runs x86 instructions.

  • Re:Atom? (Score:5, Informative)

    by John Hasler ( 414242 ) on Sunday August 29, 2010 @05:18PM (#33410742) Homepage

    Infineon doesn't make CPUs. Intel is probably most interested in their RF stuff. Believe it or not, there's a lot more to a cellphone than the processor.

  • Re:Infineon? (Score:5, Informative)

    by klingens ( 147173 ) on Sunday August 29, 2010 @05:18PM (#33410744)

    Infineon hasn't made RAM for four years now, since they spun off Qimonda. Qimonda itself went bankrupt in early 2009.

  • by MtHuurne ( 602934 ) on Sunday August 29, 2010 @05:27PM (#33410782) Homepage
    Contrary to what the headline suggests, Intel is not buying all of Infineon: they are negotiating to buy the wireless division.
  • by Sycraft-fu ( 314770 ) on Sunday August 29, 2010 @05:32PM (#33410808)

    The deal with Rambus was a purely business one. Rambus paid them a good deal of money to use Rambus technology. Also, at the time, it really WAS faster. It wasn't enough faster to be worth the money, and of course it scaled like shit, but a Rambus P4 was quick. So Intel made the decision to use RDRAM. However, it turned out to be a bad decision as DDR SDRAM quickly eclipsed it speed-wise, which helped give AMD the edge they had at the time. So, when the deal was up, Intel chose not to continue using RDRAM, and still doesn't to this day. Rambus does make new RAM products; XDR RAM is their current thing, and the PS3 does use it. However, Intel decided it was in their best interests not to.

    Companies generally aren't buddies or anything, they just have interests that may match up. Intel thought RDRAM was the way to go, especially since they made a lot of no-cost money on the deal. I mean, $100 million is nothing to sneeze at. If someone is willing to pay you that to use their technology, and their technology looks like it works, then great. However, that doesn't mean it was "BFF for life" or whatever. It didn't work out, the arrangement ended, that is that.

  • About Infineon (Score:5, Informative)

    by dmesg0 ( 1342071 ) on Sunday August 29, 2010 @05:56PM (#33410914)

    If Infineon doesn't ring a bell, the name Siemens surely does. Infineon was the semiconductor division of Siemens before being spun off into a separate company.

    Infineon's current market cap is around 5B, so Intel is rumored to buy about 1/3 of the company (assuming some premium over the stock price).

  • by zlogic ( 892404 ) on Sunday August 29, 2010 @06:02PM (#33410934)

    Intel thought they'd develop an x86 chip with the power requirements of an ARM chip, which is why they sold their ARM division to Marvell. I guess that was a bad move, considering the progress ARM has made during the last few years.

  • by Grishnakh ( 216268 ) on Sunday August 29, 2010 @07:34PM (#33411344)

    Actually, it wasn't fast at all, depending on your definition of "fast". Rambus RAM had huge bandwidth, but terrible latency. So it was great for things like streaming media, and terrible for just about everything else. At the time, all Intel could think about was bandwidth, and they didn't give a second thought to latency. They basically thought everyone was going to start using their computers for watching movies, video editing, and little else. So they designed the P4 with a horribly long pipeline that meant any context switching resulted in terrible performance, and they used Rambus RAM which was perfectly matched to their pipeline and memory channel bandwidth. Worked great if you were doing video editing, but most other applications had mediocre performance, with sheer clockspeed trying to compensate for the huge penalty of poor latency and pipeline flushes. In the end, people were stuck with fancy new computers with horrifically expensive RAM which weren't any faster than their old PIIIs for most applications, yet consumed 3 times as much power, making their offices very warm.

    The whole thing was just a bad idea. AMD pretty quickly realized what was going on, avoided Rambus RAM like the plague, and concentrated on better performance at lower clockspeeds. AMD made huge inroads against Intel during this time. After user rebellion (including people building their own motherboards using Intel's notebook CPUs, which had a different architecture that had much lower power consumption with better performance), Intel finally dumped the P4 "Netburst" architecture and moved to "Core". They also dumped their CEO Craig Barrett who was responsible for this disaster. Since then, they've been greatly outperforming AMD, probably due to their larger size, and their strong fab technology (Intel makes all its own chips, AMD I believe outsources theirs).

  • Different market (Score:5, Informative)

    by Sycraft-fu ( 314770 ) on Sunday August 29, 2010 @07:59PM (#33411448)

    Atoms are low power, and despite what the ARM fanboys like to say, they do a lot given their power budget. However, they are still higher power than you want for mobile devices. They are targeted at low-end PCs, like netbooks, or perhaps some higher-end embedded applications. ARM chips (most of them at least) use far less power. When you are talking about the tiny batteries in cellphones, this matters. Going from a half-watt chip to a 2-watt chip means 4x the power draw. Given that the CPU is one of three major components that draw power (the LCD and radio being the others), you don't want this.

    For example, my BlackBerry has a 4.3 watt-hour battery. That means just what it sounds like: it could provide 4.3 watts for 1 hour. OK, so a CPU that uses 2 watts could drain the battery by itself in about 2 hours, even if the screen was off (which of course it wouldn't be). A half-watt CPU would last over 8 hours on the same battery. Big difference for a small device.
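
    To make the arithmetic explicit, here is a quick back-of-the-envelope sketch in Python using the comment's round figures (the 4.3 Wh capacity and the 2 W / 0.5 W CPU numbers are the commenter's estimates, not measured values):

        # CPU-only battery-life estimate from the figures above (illustrative only)
        battery_wh = 4.3                        # battery capacity in watt-hours
        cpu_power_w = {"Atom-class (~2 W)": 2.0,
                       "ARM-class (~0.5 W)": 0.5}

        for name, watts in cpu_power_w.items():
            hours = battery_wh / watts          # hours to drain the pack on the CPU alone
            print(f"{name}: ~{hours:.1f} h of CPU-only runtime")

    That works out to roughly two hours versus more than eight, before the display and radio draw anything, which is why a 2 W application processor is a hard sell in a phone.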

  • by RzUpAnmsCwrds ( 262647 ) on Sunday August 29, 2010 @11:34PM (#33412248)

    They basically thought everyone was going to start using their computers for watching movies, video editing, and little else. So they designed the P4 with a horribly long pipeline that meant any context switching resulted in terrible performance.

    If you don't know much about CPU architecture, please don't make a bunch of random statements about the P4.

    First, the pipeline length has minimal impact on the speed of context switches. Context switches are relatively infrequent (compared with the CPU frequency) and relatively slow (typically several hundred cycles at a minimum).

    The major downside of pipeline length comes from branch mispredicts. Branch mispredicts hurt you more because you have to flush more wrong instructions. Additionally, the scheduler is less able to parallelize instructions because instructions with data dependencies need to be spaced further out in the pipeline (forwarding doesn't help you unless the result has actually been computed, and in long pipelines there are typically several execution stages). Some of this can be improved with tactics like better branch prediction or multi-threading, but ultimately you give up IPC in a longer pipelined design.
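
    A rough way to put numbers on this is an effective-CPI estimate. The sketch below is illustrative only: the base CPI, branch frequency, mispredict rate, and flush penalties are assumptions, not the real P6/P4 figures.

        # Effective CPI = base CPI + branch frequency * mispredict rate * flush penalty.
        # The flush penalty grows with pipeline depth, which is the deep-pipeline tax.
        def effective_cpi(base_cpi, branch_freq, mispredict_rate, flush_penalty_cycles):
            return base_cpi + branch_freq * mispredict_rate * flush_penalty_cycles

        shallow = effective_cpi(1.0, 0.20, 0.05, 10)   # shorter pipeline (assumed penalty)
        deep    = effective_cpi(1.0, 0.20, 0.05, 25)   # P4-like depth (assumed penalty)
        print(shallow, deep)                           # about 1.10 vs 1.25 cycles/instruction

    With the same branch behavior, the deeper pipeline pays roughly 14% more cycles per instruction, and the clock speed has to climb enough to win that back.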

    Second, the P4 was not designed for "watching movies, video editing, and little else". It was designed to be fast. When Intel was designing the P4, the IPC-bag-of-tricks was starting to run out. The P6 (Pentium Pro, later evolved into the Pentium II/III) already had all the common improvements including multi-level, fast on-chip caches, a fully pipelined design, out-of-order execution, branch prediction, and multi-issue. The bottom line is that Intel realized (like everyone else) that making the chip wider or increasing caches really didn't do much for performance anymore. To keep seeing dramatic improvements in single-threaded performance, we either needed a completely new bag of tricks or we needed much higher clocks. Intel figured that they would make a CPU that (architecturally) could hit very high clocks, which means very deep pipelines to meet timing constraints. Yes, P4 would have lower IPC, but it would more than make it up in clock speed.

    For a while, it worked. P4 was not a huge winner at first, but over time (with Northwood) the P4 managed to out-gun AMD's lineup and become one of the fastest CPUs available. It doesn't matter if the Athlon could retire more instructions per clock, the P4 was clocked dramatically higher.

    The problem is that somewhere around Prescott, the process technology ran out of gas. Leakage current became an issue more quickly than Intel had anticipated, thermal issues became problematic, and despite Intel's tricks (sockets that could handle more power, BTX, etc.) it became clear that people just weren't going to put a 400W CPU in their machine.

    None of this is really a problem with the P4 architecture. With the right cooling and power, P4 can hit 8GHz. That's higher than any Intel or AMD CPU before or since.

    You'll hear people say that P4 was a marketing decision. While I'm sure that the high clocks did benefit marketing, people who know the actual architects will tell you that it had more to do with chasing single-threaded performance than it had to do with marketing.

    Some people say that the P4 was optimized for media. While it's true that highly predictable code (e.g. loopy scientific code and media encoding) performs especially well on the P4, compared with the Athlons of the day (before Athlon 64) so did everything else. You can't compare a 1.5GHz Athlon XP to a 1.5GHz P4 and argue that the Athlon is better because it's faster. P4 was specifically designed to make up for its lower IPC with very high clocks.
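
    Here is a toy throughput comparison to illustrate the IPC-versus-clock trade described above (the IPC values are invented purely for illustration; only the clock speeds are period-typical):

        # Rough throughput ~ IPC * clock; a lower-IPC design can still win on clock speed.
        def mips(ipc, clock_ghz):
            return round(ipc * clock_ghz * 1000)       # millions of instructions per second

        athlon_xp = mips(ipc=1.2, clock_ghz=1.5)       # hypothetical IPC, for illustration
        northwood = mips(ipc=0.9, clock_ghz=2.8)       # hypothetical IPC, for illustration
        print(athlon_xp, northwood)                    # 1800 vs 2520

    Clock-for-clock the lower-IPC design loses, but at the clocks it was built to reach it comes out ahead, which is exactly the bet the P4 made.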

    The whole thing was just a bad idea. AMD pretty quickly realized what was going on, avoided Rambus RAM like the plague, and concentrated on better performance at lower clockspeeds. AMD made huge inroads against Intel during this time.

    AMD made inroads very late in P4's life af

  • When it comes to desktop, laptop and server chips, Intel's pretty much got a lock on the market but everyone can see the writing on the wall: mobile chips and architectures are the future of computing thanks to the popularity of smartphones, but Intel doesn't have anything to offer in that regard.

    The server market is a different ball game. Xeons are only in the low-end servers. IBM has the lead in mid-range and high-end servers with its P and Z systems. P and Z chips are custom-designed for IBM systems only.
