Apple Considering Switch Away From Intel For Macs

concealment sends this quote from Bloomberg: "Apple Inc. is exploring ways to replace Intel processors in its Mac personal computers with a version of the chip technology it uses in the iPhone and iPad, according to people familiar with the company's research. Apple engineers have grown confident that the chip designs used for its mobile devices will one day be powerful enough to run its desktops and laptops, said three people with knowledge of the work, who asked to remain anonymous because the plans are confidential. Apple began using Intel chips for Macs in 2005."
  • ... It certainly isn't impossible. People already look at iPads and iPhones as "devices" rather than what they really are underneath all that glass and aluminum: smaller, simpler "computers". I'd say it's a safe bet that 99% of the Slashdot readership at one point owned a computer that looks positively ancient next to last year's iPhone models, but most people simply don't grasp the magnitude of what's been accomplished in technology over the last 30 years.

    Now that people look at iDevices and their non-Apple kin as devices, it just takes some time to convince them that the idea of a "computer" really isn't what they ever wanted. They've always wanted devices, and with OS X and now Windows drawing more and more from the closed-ecosystem models they spawned off for the mobile realm, people will eventually come around.

    I give it around two years before Apple comes out with a new line of ARM-based MacBook Airs, though that could change depending on how effectively Intel and AMD (really, just Intel) stave off the situation by getting lower-powered x86 options into the marketplace.

  • Only Apple (Score:0, Interesting)

    by Anonymous Coward on Tuesday November 06, 2012 @05:38PM (#41899693)

    Only Apple gets away with completely changing its computing platform and getting everyone to follow along and pretend it's no big deal. I remember a Mac guy going on for hours about all the things x86 couldn't do that PPC could. Then the x86 Macs came out, and suddenly all those "deficiencies" were no big deal. Anyway, what an insane platform.

  • What a handy leak... (Score:0, Interesting)

    by Anonymous Coward on Tuesday November 06, 2012 @05:43PM (#41899793)

    If you're about to negotiate some new contracts with Intel.

  • Re:Why? (Score:5, Interesting)

    by TellarHK ( 159748 ) <tellarhk@@@hotmail...com> on Tuesday November 06, 2012 @05:45PM (#41899831) Homepage Journal

    For the tasks most people want a computer for (or think they want a computer for), an ARM-based solution could work just as well as an x86-based one. Keep in mind that even if Apple made the switch, they wouldn't be moving to the same silicon they're shipping today, because they wouldn't need all of the power-saving mechanisms required by the mobile device markets they're in now. Instead, envision something along the lines of a hybrid machine with one high-end mobile core designed for low-power use, plus additional cores that can be brought online as needed, with the associated power draw. There are dozens of ways this kind of arrangement could be managed, and people seem quick to forget that Apple made some of the big early strides in getting multiprocessor development under control (Grand Central Dispatch, for example; see the sketch after this comment).

    Additionally, who's to say they won't have a 16+ core ARM chip running at 3GHz within the next couple of years? If die size and power management are less of a constraint, that's a lot of raw power that could be thrown at things.

    I think they'll start with something like the MBA, and move up the line from there.
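
For reference, the Grand Central Dispatch model the parent comment alludes to is exactly what makes the core count an implementation detail: the application hands blocks to system-managed queues, and the OS decides which (and how many) cores run them. A minimal C sketch against the libdispatch API; the queue priorities are real, while the idea of routing them to big or little cores is this thread's speculation, not a shipping feature:

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    int main(void) {
        /* A system-managed concurrent queue. The kernel decides how many
         * cores service it; a hypothetical hybrid Mac could route
         * BACKGROUND-priority work to a low-power core and HIGH-priority
         * work to the fast ones without the app changing at all. */
        dispatch_queue_t q =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        for (int i = 0; i < 8; i++) {
            dispatch_group_async(group, q, ^{
                printf("block %d ran on whichever core was free\n", i);
            });
        }

        /* Block until every submitted task has finished. */
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        dispatch_release(group);
        return 0;
    }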

  • by tepples ( 727027 ) <tepples.gmail@com> on Tuesday November 06, 2012 @05:48PM (#41899889) Homepage Journal

    Then you'd also get an alternative/thin boot into iOS.

    That, or Apple will follow Microsoft's lead with Windows RT's lack of sideloading and use the transition to the ARM ISA as a chance to remove the option of running software that isn't signed with an Apple Developer ID. That means Apple would get to charge owners of ARM Macs $99 per year to rent the ability to run Xcode or any other compiler on their own hardware, just as it presently does with iOS.
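
Worth noting that OS X already ships the machinery such a lock-down would lean on: the Security framework can validate a bundle against a rule written in Apple's code-signing requirement language. A hedged C sketch - the API calls are real, but the "anchor apple generic" requirement here is a simplification of Gatekeeper's actual Developer ID rule:

    /* Build with: clang check_sig.c -framework Security -framework CoreFoundation */
    #include <Security/Security.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdio.h>

    int main(int argc, char *argv[]) {
        if (argc < 2) {
            fprintf(stderr, "usage: %s /path/to/Some.app\n", argv[0]);
            return 2;
        }

        CFStringRef path = CFStringCreateWithCString(NULL, argv[1],
                                                     kCFStringEncodingUTF8);
        CFURLRef url = CFURLCreateWithFileSystemPath(NULL, path,
                                                     kCFURLPOSIXPathStyle, true);

        SecStaticCodeRef code = NULL;
        SecRequirementRef req = NULL;
        OSStatus err = SecStaticCodeCreateWithPath(url, kSecCSDefaultFlags, &code);

        /* "anchor apple generic" accepts anything chaining to Apple's root CA,
         * which includes Developer ID certificates; the real Gatekeeper
         * requirement pins specific certificate extensions on top of this. */
        if (err == errSecSuccess)
            err = SecRequirementCreateWithString(CFSTR("anchor apple generic"),
                                                 kSecCSDefaultFlags, &req);
        if (err == errSecSuccess)
            err = SecStaticCodeCheckValidity(code, kSecCSDefaultFlags, req);

        printf("%s\n", err == errSecSuccess ? "signature accepted"
                                            : "signature rejected");

        if (req) CFRelease(req);
        if (code) CFRelease(code);
        CFRelease(url);
        CFRelease(path);
        return err == errSecSuccess ? 0 : 1;
    }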

  • Re:64-way, on 1-die (Score:4, Interesting)

    by AK Marc ( 707885 ) on Tuesday November 06, 2012 @05:50PM (#41899913)
    64 quad-core A9 CPUs with 64 on-die GPUs would likely provide more aggregate computing power than any single Intel x86 chip, at lower power draw than even frugal modern laptop CPUs (64x0.25W = 16W). Apple would just need to get the cost down and write software that can drive it all. The result would be longer battery life and more raw power than Intel offers.
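
Making the parenthetical arithmetic explicit - the 0.25 W figure is the commenter's own (optimistic) assumption, and it reads as watts per quad-core package rather than per core:

    P_total = 64 packages × 0.25 W/package = 16 W    (256 cores in total)

For scale, a standard-voltage mobile Core i5/i7 of that era carried a TDP around 35 W.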
  • Re:Why? (Score:5, Interesting)

    by realmolo ( 574068 ) on Tuesday November 06, 2012 @05:53PM (#41899977)

    Apple wants to dump MacOS.

    There is FAR more money to be made from a locked-down OS like iOS that guarantees they get a cut of every app sold. The profits from iOS devices DWARF the profits from MacOS.

    MacOS will be gone in ten years. Less, probably. You'll still be able to buy a Mac, but it will run iOS, and only run "approved" apps. Unless you pay a couple thousand bucks for their "developer" license, in which case you will get a copy of Xcode. And a yearly fee on top of that, of course. And probably a limit on the number of apps you can develop before you have to pay more money.

    Apple is NOT about making cool technology anymore. They are about selling content. They're a media company.

  • Re:Why? (Score:0, Interesting)

    by Anonymous Coward on Tuesday November 06, 2012 @06:01PM (#41900097)

    OS X applications are still single-threaded, like 99% of all applications. You ever tried writing code for multi-core? Thought not.

    The reality is that the typical Mac user does nothing more than FB, Twitter and iTunes. Most designers use Win machines for the front end and Linux render farms. Developers? What OS X developers!?
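
As a counterpoint to "you ever tried writing code for multi-core?": on OS X the easiest multi-core idiom is nearly a drop-in replacement for a serial loop. A minimal C sketch - dispatch_apply is the real libdispatch call; the scale() helper and the sample data are just illustration:

    #include <dispatch/dispatch.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Scale an array in parallel. dispatch_apply fans the iterations out
     * across the machine's cores and returns once all of them are done. */
    static void scale(float *data, size_t n, float k) {
        dispatch_queue_t q =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_apply(n, q, ^(size_t i) {
            data[i] *= k;   /* each index is touched exactly once: no locks */
        });
    }

    int main(void) {
        float v[4] = {1, 2, 3, 4};
        scale(v, 4, 2.0f);
        printf("%.0f %.0f %.0f %.0f\n", v[0], v[1], v[2], v[3]);  /* 2 4 6 8 */
        return 0;
    }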

  • Re:Why? (Score:5, Interesting)

    by Sponge Bath ( 413667 ) on Tuesday November 06, 2012 @06:12PM (#41900273)
    What may be happening (and being misinterpreted by the press) is Apple exploring a hybrid machine, with ARM used for always-on iOS services and Intel for booting into full OS X. Didn't Dell do something similar, where a laptop had an ARM chip for playing CDs and other small stuff without fully booting the OS?
  • by viperidaenz ( 2515578 ) on Tuesday November 06, 2012 @07:32PM (#41901223)
    Not if there is a rogue prisoner opening gates.
  • by Culture20 ( 968837 ) on Tuesday November 06, 2012 @08:06PM (#41901577)

    They'll migrate back to Windows just like they did when Apple ruined Final Cut Pro. The mass exodus to Adobe Premiere running on Windows left FCP as pretty much a non-player at this point for serious video editing.

    Word. I've seen them migrate with other Adobe products too: once they have to use Windows for one purpose, they start using it for others.

    Thankfully the Xserve debacle caused some higher-ups to realize that Linux on cheaper servers is a better option anyway.

  • Re:64-way, on 1-die (Score:5, Interesting)

    by Dekker3D ( 989692 ) on Tuesday November 06, 2012 @08:08PM (#41901599)

    Well, Macs have always been associated with graphical artists. I personally love working on my Cintiq on Windows/Linux and wouldn't touch a Mac with a 10-foot pole (something big needs to happen before I give any money to a company as anti-freedom as Apple), but... yeah. Most people using advanced drawing programs or 3D rendering/sculpting software need a lot of CPU and GPU power. Some apps lean on one more than the other, but I don't think any 3D artist these days will look at his render times and say "welp, that'll forever be fast enough for me!"

    Programmers are traditionally more Unix/Linux folks, and a lot of programmers use compilers that emit bytecode rather than native executables, so I don't think they're of much concern to Apple... but this does mean that multi-OS support will fall behind again. And given that Windows (about 80%?) and Linux (maybe another 5%?) account for a huge share of desktop users, that probably means the Mac will be left behind. Besides that, a large share of FOSS software seems to be developed mostly on Linux anyway, so...

    I don't think this'll affect non-Mac users much. It may hurt Apple's bottom line a bit, but the forced upgrades will compensate, and probably even cause a bit of a jump in profits. It'll just deepen the divide between Mac and non-Mac.

  • Re:Why? (Score:4, Interesting)

    by gman003 ( 1693318 ) on Tuesday November 06, 2012 @10:02PM (#41902535)

    Nintendo did this on the Wii - there's a primary PPC processor, and an ARM core on the northbridge that is used for running updates while the console is "off". Worked fairly well by all reports.

  • by maynard ( 3337 ) on Tuesday November 06, 2012 @10:12PM (#41902589) Journal

    This is right.

    It's about more than just creating social and legal controls over a technology that threatens traditional power structures, though personal computing has done that - just look at how social networking has supported political revolts across the world. Governments and their business patrons fear this power shift.

    So, how have they responded?

    The western national economies have transformed their income streams from production to rent collection, a process ongoing since the 1970s. This has devalued all forms of manufacturing, where raw materials are converted to useful things through work, and thus devalued those who perform the labor in the process. It's not automation that has destroyed manufacturing in the United States. In fact, that claim is ridiculous on its face, since - by definition - automation increases productivity, which should presumably lead to long-term industry success.

    No, instead, free capital flows shifted productive work overseas, where cheap labor - sometimes slave labor - was available. This is called 'globalism'. But we should view the term as a misnomer, given the disparity between how easily capital crosses national boundaries and how firmly labor is locked into the nation state by borders and immigration law. It's not 'global free trade', it's arbitrage. This happened not just in lock-step with deregulating the financial industry - Wall Street - at the expense of labor, but because of it. For the power shift from government to the financial sector has diminished the political power of citizens - and especially labor - in the process, because it's pretty damn hard for the poor to exercise real political power. That transformation benefited power bases in both government and the financial sector.

    But how does all this relate to computing lock-down and DRM?

    It's the model for how to understand vendor lock-down in computing. For open computing platforms decentralize power by freeing people to use computing in ways never intended by the vendor (or government). This used to be called innovation. Back in the 1970s, every personal computer was open. The Apple II shipped with a manual that included schematics. Bus specifications were open. Computers booted to BASIC, a programming language, by default. Now, not everyone wants to program, and computing shouldn't be viewed strictly from that mindset. But consider what happened to the minicomputer market as a result: Digital, for example, collapsed trying to maintain its vendor lock-in against competition from open systems - primarily the IBM PC and the clones we still use today. Because people like freedom, even when they don't directly use that freedom to tinker and create themselves.

    So I'm arguing that in the same vein that the financial industry gained protected privileges (deregulation) which gave it market advantage over labor, so too have the titans of the software and tech industry - IBM, Microsoft, Apple, Oracle, etc. - bent law and regulation to their benefit, at the expense of small competitors and even their own customers. Like 'deregulation' for Wall Street, the tech industry has its own legal maneuver, this time through copyrights, patents, and trademarks, all of which are a form of government-regulated monopoly protection.

    And all this in the Orwellian name of 'freedom'. In the financial industry, they called it 'free trade'. In the tech industry, it's 'freedom to innovate'. But in both cases the freedom isn't decentralized down to small businesses or citizens; it's centralized up toward the largest market players. It's a freedom to engage in monopoly control over markets, whether the labor market, the tech market, or any other market where players are big enough to buy protection from legislators and the court system. Protection, not from other big industry players - by and large - but protection from small competitors who might innovate their way into market dominance, and protection from customers.

  • by tepples ( 727027 ) <tepples.gmail@com> on Tuesday November 06, 2012 @11:01PM (#41902817) Homepage Journal

    that's trivially disprovable if you actually try using one for the minute or so that it takes to enable sideloading

    I was under the impression that sideloading using a developer certificate would disable itself after a month, and Microsoft had ways to detect "fraudulent use of a developer license" as a sideloading method. What other method of sideloading were you talking about? The one that involves buying a 100-seat sideloading license for $3000 [zdnet.com]?
