Why Apple and Microsoft Are Using Last Year's Skylake Processors In Their New Computers (gizmodo.com)
Apple released new MacBook Pros yesterday that feature Intel's year-old Skylake microarchitecture, as opposed to the newer Kaby Lake architecture. Two days earlier, Microsoft did the same thing when it released the Surface Studio. Given the improvements Kaby Lake processors have over Skylake processors, one would think they would be included in the latest and greatest products from Microsoft and Apple. Gizmodo explains why that's not the case: In the case of the new 15-inch MacBook Pro, the answer is simple. "The Kaby Lake chip doesn't exist yet," an Apple rep told Gizmodo. Kaby Lake is being rolled out relatively slowly, and it's only available in a few forms and wattages. The 15-inch MacBook Pro uses a quad-core processor that currently has no Kaby Lake equivalent. That particular laptop really does have the fastest processor available. The same goes for the Microsoft Surface Studio and updated Surface Book -- both also use a quad-core Skylake processor with no Kaby Lake counterpart. But the Studio and Surface Book are also using much older video cards from the Nvidia 900 series. Nvidia has much faster and less power-hungry chips (the 1000 series) available, based on the Pascal architecture. Microsoft's reasoning for going with older video cards is nearly identical to Apple's for going with a slower processor in its 13-inch MacBook Pro: the Nvidia 1000 series came out too late. The major intimation was that Kaby Lake and Pascal came so late in the design process that they would have delayed the final products if they'd been chosen. New technology, no matter how amazing an upgrade it might be, still requires considerable testing before it can be shipped to consumers. One minor bug, particularly in a system as tightly engineered as the Surface Studio or MacBook Pro, can turn catastrophic if engineers aren't careful. In Microsoft's case, it's frustrating, because that old GPU is significantly slower than the Pascal GPUs available. It's a little less frustrating in Apple's case, largely because of how Apple handled the older microarchitecture it elected to put into its new 13-inch MacBook Pro. Apple went with a new Skylake dual-core processor that draws a lot of power -- more than any Kaby Lake processor available -- and then uses all that extra power to ramp up the processor's clock speeds, which means it can actually match the speeds of the fastest Kaby Lake processor out there. The only downside to this decision is battery life.
Re: (Score:2, Insightful)
It's just a shame that the Macbooks are capped at 16GB RAM, no 32GB option. For battery life reasons.
Re: (Score:1)
Anyway, this is all meaningless drivel. Apple designed a laptop to run 10 hours. Double the memory, add more power-hungry graphics chips, and you get maybe 5 hours if you're lucky. So then Apple gets flamed for having Dell/HP battery life. Well, they'd get flamed by Slashdot nerds even if they invented time travel and warp drive. But the rest of the normal population wants longer battery life, not minuscule speed bumps from a CPU change.
Re: (Score:3)
not meaningless at all, 5 hour battery life is fine.
my macbook is only unplugged for two or three hours for meetings anyway.
oh, that's not your use pattern? so fuck you, you can get an 8GB or 16GB one
Re: (Score:1)
What do you use more than 16GB of RAM for, exactly? I switched to a MacBook Pro 15" from Linux a couple of years ago. I had tried a Mini when they first came out and knew that OS X was RAM-hungry, so I upgraded my MBPro to 16GB when I ordered it. With a browser, GarageBand, Spotify, and other apps all running at the same time I see no slowdowns whatsoever. And this is a 2014 model. I would think the newer ones are even more efficient.
I've been using an 8 GB 2015 MacBook to edit Photoshop files running into the hundreds of megabytes; my biggest files are in huge resolutions that weigh in at around a gigabyte, and I can't say I've been pining for 16GB. I'm sure it would probably help, but I don't think it would make a truly massive difference. Solid-state disks and USB 3 have done more to speed up my work than any CPU upgrade, because they've massively cut the time it takes to load, save and, in the case of USB 3, transfer files.
Re: (Score:2)
Multiple virtual machines. Running lots of applications simultaneously, as a multimedia developer will sometimes do. Editing very large images or 4K video. Developing and building large software packages. Those are all tasks that can benefit from having more than 16GB RAM.
Right now, those users are a small subset of the total user base. But they do exist.
Re: (Score:2)
Seriously... I do have 32GB in my Mac Pro, but I rarely use more than 17GB, and that's including cache, etc. that could easily be freed without a substantial impact. That's with a couple of browsers, iTunes, Eclipse, Xcode, email, Twitter client, Slack client, Skype, OneNote, RSS reader, database IDE, and a smattering of other stuff running at once. I upgraded my (much older) MBP to 16GB, and RAM has never been the bottleneck.
Re: (Score:2)
The point is customers should be able to buy the amount of RAM they want for their usage pattern.
Re: (Score:2)
Re: (Score:2)
Not brand loyalty, it's correct tool for the job. Windows is the wrong tool for my job.
Re: (Score:1, Insightful)
When you see Apple users excusing poor design decisions as good ones, you know the Apple of Steve Jobs is gone. Not only considering it a good one, but saying it is the superior choice (well, because of reasons).
iPhone users have been justifying Apple's poor design decisions for years, including back when the Turtlenecked one was alive:
* Battery life is essentially capped at less than a day, and every improvement in power saving has been offset by a cut in phone thickness, instead of giving us the option of a battery that lasts longer. "Thinner is better."
* Remember antennagate? "You're holding it wrong."
* iOS upgrades that break functionality. "You just have to learn it."
* Pulling USB mass storage support so that people can't copy music. "[crickets]"
Re: (Score:3)
> Pulling USB mass storage support so that people can't copy music. "[crickets]"
What do you mean?
Re: (Score:3)
> Pulling USB mass storage support so that people can't copy music. "[crickets]"
What do you mean?
The closest I can get is his Chevy can beat your Ford any day.
Re: (Score:2)
BT keyboard is one of those weird things that's useful occasionally. I've enjoyed reactions from people, "How do you text so fast???" when I would pair a BT keyboard to my iPhone before iMessage could bridge over from a laptop with a real keyboard. I like my BT heart rate sensor for running too...
Re: (Score:2)
Nah... We did that for Steve too. Reality Distortion Field was a thing...
Apple users are a bit more willing to look at realistic real world compromises and determine that trading one thing for another makes more sense for some of us. I can't charge & listen to wired headphones on my iPhone 7. I used to do that on my 6. Like twice a year when I had to travel on Amtrak. Now I'll do something
Re: (Score:2)
My understanding is that RAM becomes fragmented during use, and the OS typically makes no effort to remedy this since on most (all?) computer systems RAM access is fast and uniform regardless of fragmentation.
Anyway, to achieve the power benefits of shutting down the second bank, the OS would need to run a RAM defragmentation and consolidation routine to clear that bank before it can be shut off.
I'm not making excuses for Apple, it's an interesting idea. Does anyone know of other hardware that does anything lik
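The parent's idea can be sketched in a few lines. This is a toy Python illustration only, not a real kernel API -- the Bank class, page ids, and power_down function are made up for the example -- but it shows why live pages would have to be consolidated into one bank before the other could be powered off.

# Toy illustration: consolidate pages into one RAM bank so the other can be powered off.
# Hypothetical names; a real OS would copy frame contents and remap page tables.
from dataclasses import dataclass, field

@dataclass
class Bank:
    name: str
    capacity: int                              # number of page frames in this bank
    used: set = field(default_factory=set)     # ids of pages currently resident here
    powered: bool = True

    @property
    def free(self) -> int:
        return self.capacity - len(self.used)

def power_down(victim: Bank, target: Bank) -> bool:
    """Migrate every page out of `victim` into `target`, then cut power.
    If the surviving bank lacks room, consolidation fails and both banks stay on."""
    if len(victim.used) > target.free:
        return False
    while victim.used:
        page = victim.used.pop()               # in a real OS: copy the frame, update mappings
        target.used.add(page)
    victim.powered = False
    return True

bank0 = Bank("bank0", capacity=4, used={1, 2})
bank1 = Bank("bank1", capacity=4, used={3})    # fragmented: live pages spread over both banks
print(power_down(bank1, bank0), bank1.powered)  # True False -> bank1 can now be shut off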
Re: CPU ain't all there is (Score:1)
Get an iPhone 7 then (Score:2)
A standard headphone jack (cuz bluetooth audio sounds like shit even on mediocre headphones)
iPhone 7 ships with a standard headphone jack (via dongle) or superior digital audio interface if desired (which the included WIRED headphones use)
An SD card (cuz the vendor trumpets xx gigs of storage, without mentioning 3/4 of xx storage is crapware you can't delete)
iOS has a low and predictable memory footprint and does not ship with crapware.
Nougat, cuz it's been out for a month or two now
From personal experience yo
Maybe they want to sell BOTH product lines ? (Score:4, Interesting)
FOMO? (Score:3, Interesting)
No, FOTU. Fear Of The Unknown
Current chipsets have enough power to make any device seem very quick to the average user. Only the super-high-end buyers would even be able to name the latest. Why risk using a brand new chip?
How many incremental units do you ship because you used the latest new chipset vs. the downside risk of potential issues with a chip that has not been tested in a full market release?
It's math. Nothing complicated about it.
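For what it's worth, the parent's "it's math" can be written as a one-screen expected-value comparison. The numbers below are invented purely for illustration; the point is the shape of the trade-off, not the figures.

# Back-of-the-envelope expected-value sketch of "newest chip vs. proven chip".
# All numbers are made up for illustration.
units_gained      = 5_000        # extra units sold because the spec sheet says "latest chip"
margin_per_unit   = 300          # dollars of profit per extra unit
p_launch_problem  = 0.05         # chance the untested chip causes a recall or delay
cost_of_problem   = 50_000_000   # dollars that recall/delay would cost

expected_upside   = units_gained * margin_per_unit       # 1,500,000
expected_downside = p_launch_problem * cost_of_problem   # 2,500,000

print(f"expected upside:   ${expected_upside:,.0f}")
print(f"expected downside: ${expected_downside:,.0f}")
print("use the new chip" if expected_upside > expected_downside else "stick with the proven chip")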
Re: (Score:1)
If computer manufacturers had taken your advice, we'd still all be running machines with 8086 processors.
So how does a chip get a "full-market release" unless the company that is supposed to be the super-premium level S-Rank of personal computing actually uses it?
Re:FOMO? (Score:5, Informative)
Sorry, but only a person who has absolutely no clue about how a hardware product is developed (and how long it actually takes!) could say nonsense like this.
A new product like the Surface computer or MacBook is in development for more than a year, often even 2-3 years. And in the latter stages you actually need a stable and working system so that things like drivers can be developed, the OS adapted, demo units produced, CE/FCC testing done, etc.
So if a new CPU/chipset combo shows up in the last 9-12 months of the cycle, it is simply too late - it would delay the release of the product by at least that much. This is *not* about just swapping a motherboard/CPU/GPU - the board for the chips needs to actually be *developed* first, before you can even start thinking about integrating it into a product.
The risk mitigation is also important, but that comes into play only after everything above is sorted out already. If there is nothing new to put in your product, you have no "unknown" to fear in the first place.
Re: (Score:1)
Then how did other manufacturers that released laptops with Kaby Lake do it?
Re: (Score:3)
Indeed. The German Tuxedo, ahead of System76 and others, shows among many others a config with a Kaby Lake i7-7500U, 32GB RAM, all possible ports (incl. USB 3.1), up to 3 SSDs/HDs (incl. fast 2048GB ones), a removable battery (yes), 2kg, the preloaded Linux you want, all of this at roughly the same cost.
Re: (Score:2)
Because these guys are shipping what is basically the reference design from Intel packaged into a case, sans custom OS and with very little to nothing to develop?
That's not quite an apples-to-apples comparison. Apple has pretty much everything custom - the motherboard, the OS, the peripherals on the MB, a ton of tuning and tweaking so that the system doesn't just boot but actually runs well, etc.
If you want to compare, then look at major manufacturers that are using custom motherboards - e.g. DELL or Lenovo.
Re: (Score:2)
> They could wait 6 more months and create a modern notebook.
But why? The latest Macbook Pros had Broadwell and Haswell. It's not unreasonable for the 2016 models to have Skylake. Next year's refresh will probably have Kabylake, or Cannonlake, depending entirely on what exists for their needed spread of things. They refresh every year, going to whatever processor is best. Why delay for half a year to match Intel's (ever changing) cycle?
Perhaps if nVidia would quit changing BGA pinouts (Score:3)
Perhaps if nVidia would quit changing BGA pinouts, companies would be more likely to substitute their newer processors.
Of course if they did that, companies might also substitute a competitor's part instead. Then nVidia would end up having to compete on price/performance. And no one wants that.
Re: (Score:3)
nVidia already competes on price and performance... AMD is just not a great competitor, and basically nobody else is bothering to try.
There is no way in hell a system designer is going to substitute a newer part, unless they can:
(A) do it without a redesign/board relayout
(B) do it in a way that lets them back out of the decision when the newer part screws up horribly
If you object to the "when" in option (B), then you can object by making the part not fail while I still have the option of backing out the part choice. If you don't fail, I don't back out the part choice.
It's really very simple.
Re: (Score:2)
One word: "daughtercard".
Whine about thin form factors not allowing for it all you want, but you're wrong.
So because you can't get your shit together and keep your BGA layout the same, I have to make up for it by making a separate carrier card that rearranges the pins so that they are the same again on an edge connector?
It's not goddamned rocket science.
I agree: if you want a PCIe interface, then export one at the BGA level, and don't make me add cost to my product because you are too lazy to route the pins yourself.
Or, you know, I could just use your older product that I know already doesn't suck, and you can just wait a year to 10 months for t
Re: (Score:2)
You sound like the engineering version of the salesman that says "Well, I've sold this feature to the client, so you make it work" regardless of how much harder it makes everybody else's job. Marketing will have a harder time trying to sell new features, accounting might find they can't charge as much of a premium so margins are shit, but the system designer has decided there's "no way in hell" we're replacing that CPU/GPU. Maybe if it's capacitors on a board, but when it comes to headline features I expect those cho
Re: (Score:2)
Obviously you need to inform them what it'll cost in time, resources, and risk, and to push back when they make unreasonable demands for changes - like we all do - but I doubt it's really "that simple".
They got that: having a different BGA is going to cost them 9 months (minimum) of not selling their chip on Microsoft Surface or Apple MacBook products.
I don't care. (Score:4, Insightful)
This doesn't matter to me at all.
What matters to me is:
1) Moderately powerful discrete GPU options
2) Anti-glare LCD panels
3) Ports (you know, things like USB 2.0/3.0, Ethernet, headphone/microphone jacks, DisplayPort, etc)
4) More than 16GB of RAM
5) User replaceable batteries, OR a built-in battery of sufficient capacity this doesn't matter
6) Keyboards with a reasonable amount of key travel (0.5mm or whatever it is on the nMBP is hardly sufficient)
7) Apparently, I can add "keyboards with a reasonable amount of physical keys" to this list as well
A quad core CPU would be nice. Beyond that, I don't really care because anything "i7" is already fast enough for me. I don't need the latest greatest CPU the moment it comes out. It would be nice if the rest of the machine were kept up to date though, in terms of GPU options and other stuff, so that when I do decide to purchase a machine I'm actually getting something indicative of modern day technology (even if the CPU is a generation behind). Situations like the MBP (where everyone waited for this "major update") and nMP are pretty much inexcusable for a company with $200B in the bank.
You should get a MBP then obviously (Score:2)
1) Moderately powerful discrete GPU options
15" MBP is a Pascal based GPU, not the most powerful but fairly powerful. 4GB at max.
2) Anti-glare LCD panels
They have been since forever. My 15" from 2013 has anti-glare stuff on the screen.
3) Ports (you know, things like USB 2.0/3.0, Ethernet, headphone/microphone jacks, DisplayPort, etc)
It has four ports that are any of those things you want plus more, with a very high rate of transfer.
4) More than 16GB of RAM
Not impossible you know. [stackexchange.com] It will just cost a lot
Re: (Score:3)
It's just human nature to list his requirements as he did so I don't blame him. Would you rather have a more complete list? Such as list off requirements for things like having a QWERTY keyboard, run on 120VAC, have a track pad, etc.?
When you go shopping for a car do you specify to the dealer that it have four rubber tires and a windscreen? People will specify what differentiates newer models from older and competing new models. At one time people would have specified things like intermittent wipers and
Re: (Score:2)
Was anything he stated not reasonable to want in a notebook?
Keep thinking you're edgy though.
You forgot to add you need powerful memes of hipster mac users and overpriced garbage.
Edgy? Hell no. I just expect my computer to work after an update. That's not edgy, that's just what people should expect.
Text (Score:5, Insightful)
Did they really need that much text to explain the situation? I feel like that paragraph contained a lot of words, but said very little.
Re: (Score:2)
And even then, it doesn't explain the whole situation.
Apple typically uses the Intel quad-cores with the high-end integrated graphics (Iris Pro, or whatever it's called now). And although they were published on Intel's Ark database, they didn't have a price and were not used in hardware until June or so.
Thus, the new 13" MacBook Pros use 4 month old chips. That's not my definition of old.
Please someone correct me here. Intel's release schedule has gotten so complicated that I can't keep up.
Re:Text (Score:5, Informative)
From what I can tell (I don't think Apple has given us the chip numbers), it goes like this:
(remember that "i7" and "i5" don't have meanings- they are just marketing garble, and don't, for instance specify the difference between hyperthreading and non-hyperthreading, or two and four cores: all of these chips have hyperthreading)
https://en.wikipedia.org/wiki/... [wikipedia.org]
(I could have messed up something in transcription)
13" Cheap Model, with TDP 15W:
base: 2 core i5-6360U @ 2.0GHz single core boost to 3.1GHz with Iris 540, (listed as unreleased on wikipedia)
high end: 2 core i7-6660G @ 2.4GHz single core boost to 3.4GHz with Iris 540 (listed as unreleased on wikipedia)
13" Spensy Model, with TDP 28W:
base: 2 core i5-6267U @ 2.9GHz single core boost to 3.3GHz with Iris 550 (listed as unreleased on wikipedia)
midline: 2 core i5-6287U @ 3.1GHz single core boost to 3.5GHz with Iris 550 (listed as unreleased on wikipedia)
high end: 2 core i7-6567U @ 3.3GHz single core boost to 3.6GHz with Iris 550 (listed as unreleased on wikipedia)
In this case, all of the high-end models have Iris 550, and all of the low-end models have Iris 540. Intel's actual highest listed Iris models are the Iris Pro 580, but all of those are on chips that are either pretty expensive, have a higher TDP, or both.
Meanwhile, the 15" laptops all have Radeon graphics cards in them. These have chips that offer more processing power, but less graphics power (with the obvious assumption that the Radeon graphics will be used for that purpose).
15" models all have TDP 35W chips.
15" 256 GB model base: 4 core i7-6700HQ @ 2.6GHz single core boost to 3.5GHz with HD 530 (listed as Sep 1 on wikipedia)
15" 512 GB model base: 4 core i7-6820HQ @ 2.7GHz single core boost up to 3.6GHz with HD 530 (listed as Sep 1 on wikipedia)
Both model high end: 4 core i7-6920HQ @ 2.9GHz single core boost up to 3.8GHz with HD 530 (listed as Sep 1 on wikipedia)
These models are all generally more capable than similar models released earlier. It is likely that Intel and Apple actually reached an agreement on branding and capability for these: it is likely not a coincidence that Intel happened to have highly compatible i5/i7 branding for each step of Apple's needs, for instance.
Regardless, I've seen folks on social media pointing out that Apple really IS using the best Intel chips available, including doing it myself some, as people were all 'muh kaybee layke?' over the last day. These chips are a mix of hyperthreaded 2-core chips with Iris 540 or Iris 550 (on the 13 inch) and hyperthreaded 4-core chips with the lesser HD 530 (on the 15 inch). Meanwhile, the only Kabylake that looks like it could show up to this party at all is the 7500U, a 15W chip with 2 cores, going from 2.7GHz base to 3.5GHz single-threaded boost with HD 620. This chip could maybe have slotted in over the cheap model's chip (it costs more, though), would require just that one model to be designed and tested around Kabylake stuff, and wouldn't have the Iris graphics (and doesn't have a graphics card). Intel certainly doesn't have the Kabylakes needed to fit their intended build case.
Re: (Score:2)
Excellent post, should be modded up!
Re: (Score:1)
You're almost right about the best possible CPUs; Apple could have released low-end (2-core/4-thread) MacBook Pros with Kaby Lake CPUs, but I don't think marketing would allow it.
Apple only really cares about the immensely profitable iOS devices. The iPad Pro is too close to the price of the abandoned MacBook Air and the forgotten MacBook, which explains why there's no touchscreen and Apple Pencil on new Macs.
About release dates, according to ark.intel.com all these CPUs were released in Q3 2015, except the i7-6
Re: (Score:2)
Yea, it's the 6660U. The G was a typo. G and U are nowhere close to each other on my keyboard, so no clue how that happened.
I mention the possibility of shoving a Kabylake into the low end, along with some theories as to why they would not (it could require different chipset and testing, it lacks the top end graphics option), and yea, marketing could be a part as well. But if there are valid technical reasons that we can see, there's probably more that Apple can see.
> Apple only really cares about the
Re: (Score:1)
It still is a crappy product and a lemon. I mean that sincerely, not as a troll.
First off, the AMD graphics is an RX 450! 450, you know, the GPU that has about 1 teraflop, or about the speed of a 2011-era card and probably close to your cell phone??! Even the consoles of 2013 have the same quality graphics.
Where is the "PRO" in this? Apple's current pro has 4-year-old hardware while its non-pro version is more modern. Only a dual-core Skylake? You're kidding, for a $2700 system? God Almighty!
Even if they had to st
Re: (Score:2)
> First off, the AMD graphics is an RX 450!
The low end 15" costs 2400 bucks and has a "Radeon Pro 450". The high end 15" costs 2800 bucks and has a "Radeon Pro 455". Both can be upgraded to the "Radeon Pro 460". You are correct about the "1 teraflop" in the low end one. I don't *think* you can straight compare to the desktop RX cards, and I don't *think* that teraflops is the best metric (especially when comparing to consoles). That being said, it is absolutely clear that these are not super powerful
Re: Text (Score:1)
Oh please, the current MacBook Pros are 2012-era hardware. They haven't been current or fast since Steve Jobs passed.
If I am paying a premium I want professional-grade and up-to-date components. Who gives a shit about USB-C when the CPU chokes as soon as you compile code, do video editing, or run virtual machines? I own a dual-core hyperthreaded one and know firsthand!
Sorry, admit it, Apple lost. This is a great MacBook Air.
Re: (Score:2)
You aren't backing up your absurd statements with data. Do you really want me to refute your "2012" claims? You have multiple generations of processors between now and then, the RAM in question wasn't used in 2012, you couldn't even get this level of graphics processing in a laptop, USB-C was years away still, etc.
You own a "dual core hyperthreaded" that appears to be a Haswell i3. The lowest end macbook pro is faster than that by a lot, and costs 1500 bucks. That's a lot, but you are constantly compari
Re: (Score:2)
Fine, here is my source on older hardware [theverge.com].
And here is the CPU, which is a glorified i3 with hyperthreading [wikipedia.org], also called the i5. It is not the quad-core model.
Apple has a lot of explaining to do. If this were a normal company, this product would bomb unless priced appropriately, sub-$1000. Like I said, this is a 2016 MacBook Air. Not a power anything.
Re: (Score:2)
Oh, by "current Macbook", I thought you meant the ones that came out (which are *technically* current, in that you can purchase them). Yes, the ones based on the 2012 refresh are quite fairly characterized as four year old hardware, even though that does ignore hardware such as the CPU and other parts that get refreshed yearly. Apple does that with a lot of their hardware.
The link to wikipedia is interesting, but calling it a "glorified i3" doesn't make too much sense. The "i3/i5/i7" don't have any actua
Re: (Score:2)
On boot, it's a standard top row function
Re: (Score:2)
> Now, Vim/Emacs would be well poised to use that touch bar.
In insert mode, it could say:
ESCAPE
Then when you press it and switch to command mode it could dynamically change to
MAKEBEEP
Re: (Score:2)
Re: (Score:2)
Basically, MS and Apple selected the CPUs and GPUs in their latest computers based on the practical problem of release dates. These decisions were not made to screw you over as a consumer. Film at 11.
Nonono!
Microsoft used intelligent and astute marketing decisions that are already showing how smart they are, and apple is a bunch of goddamn hipsters selling overpriced shit to stupid people that like shiny things
Re: (Score:2)
You are articulate and sound very well educated. Your large vocabulary is impressive. The use of a maximum of two-syllable words is astute. You must have a well-paid, highly technical, white-collar job and are definitely not a janitor or drive-thru worker. English certainly must be your first language.
I speak and write at the level of my audience.
Re: (Score:2)
Got to get those Christmas sales I guess.
Both MS and Apple should have waited 3 months. Much, much better graphics could have been on their high-end products. I mean, why pay $2700 for a MS Surface Studio with a slow 960 GPU? Also, it is unforgivable that Apple included just a dual-core with hyper-threading in their so-called professional line.
I own the Haswell version of this chip and it is nothing like a real quad-core; when you add the loads that professionals use, it starts to break down quickly in Visual Studio
Re: (Score:2)
Except then you'd never release anything, ever, because there's always "something better" coming in 3 months.
Re: (Score:2)
Well, they released them knowing the components would be out of date at launch, at ultra-expensive prices. I mean, who pays $2700 for a PC anymore? Both of these are insane, and yes, AMD Fusion is 2 months away and Kaby Lake is practically here now. If they waited 3 to 6 months they would have ultra-new components in their ultra-expensive products at launch, for a longer product cycle.
Just as an example, the Nvidia 10xx series is dramatically faster, as even their low-end 1060 performs like last year's high-end 970.
Re: Text (Score:3)
Re: Text (Score:2)
These are Pros, not Airs for consumers. Yes, the newer GPUs use Samsung 14 nm processes compared to the 28 nm of the previous generation. Big boost in performance.
These are supposedly for professionals. Not capable of any real work dealing with VMware Fusion, Adobe Premiere, compiling code, or anything else a professional would use.
Re: (Score:2)
These are Pros, not Airs for consumers
[sarcasm]And pros never buy during the Christmas season. And consumers never buy the MacBook Pro.[/sarcasm]
Yes, the newer GPUs use Samsung 14 nm processes compared to the 28 nm of the previous generation. Big boost in performance.
[Citation Needed]. My information says that the Radeon Pro 455 is Radeon Arctic Islands architecture and is manufactured on TSMC's 16nm FinFET process, not a 28nm process. [wccftech.com]
These are supposedly for professionals. Not capable of any real work dealing with VMware Fusion, Adobe Premiere, compiling code, or anything else a professional would use.
Let's look at this argument. You are saying that pros would benefit greatly enough from using Kaby Lake over Skylake. The fastest mobile Kaby Lake is the Core i7 7500u [cpubenchmark.net](cpu score: 5381) vs a Core i7 6700T [cpubenchmark.net] (cpu score:8971) which A
Re: (Score:2)
Please post the cpuinfo of the Haswell chip you own. I suspect that we will find a discrepancy with this claim, based on the other post you made. The low end Surface Pro 3 you mention has a 1.5 GHz chip branded as i3, and none of the macbook pros have that.
Re: (Score:2)
I think the situation is very simple. AMD is in a coma.
AMD still makes great GPUs. I bought an AMD RX 470 and it is about as fast as an Nvidia 970 and can do 1080p ultra settings for $215 on anything I throw at it! :-)
The AMD Fusion CPU and the higher-end RX 490 and RX 5xx cards are mere months away and look quite promising. The problem is Apple used the cheapest of the cheap outdated GPU
MS released Skylake last year (Score:2)
And it was a disaster. Drivers just plain weren't ready. BSODs for months, epic fail. Thankfully, Apple is smarter than that.
Re: (Score:2)
Apple also released a Skylake last year, in the iMac. I haven't heard of such problems (though it's possible they exist; I think I'd have seen people flipping their shit). These are new Skylakes, of course, and were not available last year.
Re: (Score:2)
Skylake STILL isn't ready on most Linux distributions. On Ubuntu 16.04 LTS, the kernel is missing Skylake support for several features that cause issues, from a black screen upon boot for 10+ minutes (monitor shows no signal shortly after the boot log messages stop (ie when it gets to the console login screen), and stays that way for 10+ minutes. The IPMI KVM console also shows no signal, but the IPMI serial console works), to IOMMU isolation issues. Don't plan to use the stock kernel if you intend to use I
Re: (Score:2)
Skylake STILL isn't ready on most Linux distributions. On Ubuntu 16.04 LTS, the kernel is missing Skylake support for several features that cause issues, from a black screen upon boot for 10+ minutes (monitor shows no signal shortly after the boot log messages stop (ie when it gets to the console login screen), and stays that way for 10+ minutes. The IPMI KVM console also shows no signal, but the IPMI serial console works), to IOMMU isolation issues. Don't plan to use the stock kernel if you intend to use IOMMU isolation on a Skylake Xeon for PCI passthrough or SR-IOV, everything gets lumped together in the same IOMMU group making it impossible. It doesn't have the Skylake patches yet, which only recently came out.
I have to manually add in a set of Skylake patches from newer kernels and recompile to get it to work each time there's a kernel update. Hopefully that'll be fixed when the 16.10 kernel is backported to 16.04, and I can switch to that. But if Skylake support is still iffy on current LTS Linux distributions, then forget about Kaby Lake. It's just not ready yet and will frustrate end users.
Shoot, it isn't ready on Windows either :-)
Especially if you own a Windows 7 box, as Intel only seems to care about Windows 10. NVMe on 7 is quite buggy from what I hear. I am glad I am still on Haswell, as it serves my purposes fine. I am just irritated I can't do GPU passthrough on my K-series i7, but it is stable and works.
Hopefully AMD Fusion, which is about to come out, can provide some competition. Intel could use it
Rewrite the headline, delete TFS and TFA (Score:4)
"Apple and Microsoft use Skylake processors in new computers due to Kerby Lake unavailability"
There, a headline which explains everything you need to know about the summary and the article, and is one word shorter to boot.
Re:Rewrite the headline, delete TFS and TFA (Score:4, Funny)
I am shocked, absolutely shocked that both Microsoft and Apple would cheat customers by not releasing computers powered by CPUs that are not yet available. I demand an explanation!
Re: (Score:2)
> Kerby Lake
Let's go to Kirby Lake instead. I hear that generation can inhale other chips and gain their powers!
Re: (Score:2)
If Kirby inhales you, does he get a cute little Guy Fawkes mask and the ability to troll forums?
"A Skylake dual core that draws a lot of power" (Score:2)
Yeah, about 4W at full load. Is that "a lot" these days?
Also, from what (little) I've read about Kaby Lake, the improvements over Skylake are minor - power consumption is in fact expected to be identical.
I wonder when we'll see A-series MacBooks? (Score:2)
Somewhere in the bowels of 1 Infinite Loop I'll bet there's a mockup of a MacBook with an A10 processor. Or multiple A10 processors. Running a crude port of macOS. But because that would mean another round of porting legacy software over to the new chips, it won't happen until they can get a good emulator experience. Seems to me that's where things should be headed, just based on what's come up over the last few years.
Re: (Score:2)
I think it's safe to say that they've played with such a configuration. However, unless their R&D is way ahead of what they're shipping, such a device won't actually ship for a long time, if ever.
The biggest problem is that the A10 is still a two-core chip. It has four cores, technically, but two of those are slow cores for reducing power consumption. I don't know if it is possible to use all four cores, but even if it is, it would still be nowhere near as fast as a modern four-core Intel chip. The
Re: (Score:2)
The biggest problem is that the A10 is still a two-core chip.
It is 2+2 because it was designed for phones. It does not have to be 2+2 if designed for a laptop and could be quad-core. Remember, Apple ships variants of their Ax processors all the time. The AppleTV used a single-core A5 which was only ever made for the AppleTV. Still, I don't see it being as powerful as an Intel chip.
The second biggest problem is that the GPU in the A10 is designed to drive a 1920×1080 screen. Even the iPad Pro's GPU is designed to drive only a 2732 x 2048 screen. That's less than half the pixels on an iMac screen, and the iMac's GPU has to routinely drive up to two 3840x2160 UHD screens on top of that. So basically, the GPU performance would probably need to go up by an order of magnitude to replace what's there now, or else they would have to use an external GPU.
The 6-core GPU drives a 1920x1080 display. Again, for a laptop there would be more room to squeeze in more cores to handle a bigger display. Also there is no reason that Apple has to use a PowerV
Re: (Score:2)
Yeah, but adding cores isn't free. The more cores you add, the more challenging it is to keep their caches in sync. Two cores are relatively easy. Four cores are considerably harder. Six or more cores to match the multicore performance of modern Intel chips are harder still. Obviously it can be done (because it has been d
Re: (Score:2)
Yeah, but adding cores isn't free. The more cores you add, the more challenging it is to keep their caches in sync. Two cores are relatively easy. Four cores are considerably harder. Six or more cores to match the multicore performance of modern Intel chips are harder still. Obviously it can be done (because it has been done many times), but the point is that cranking up the core count is a non-trivial piece of engineering.
Which is a problem for all multi-core CPUs and not just Apple. I would argue, though, that optimizing two different sets of cores (2+2) might be harder than four of the same core. My point still is that Apple has optimized the number of cores for each device, sometimes removing a core. Apple could design a quad-core Ax laptop CPU if it wanted.
The reason it makes sense for Apple to build their own chips for cell phones is because they turn around and build ten million of each model. It makes a lot less sense to spread higher R&D expenses (for a much more complex chip) across a tenth as many devices (or less).
Not really. They design their own chips because their requirements for each device can be optimized as opposed to accepting whatever design Samsung or even Qualcomm had made to wo
Re: (Score:2)
Of course they could. My point is that they would end up needing to modify the cores themselves significantly to ramp up the core count, and that those sorts of changes would, I suspect, be too significant to pay off wh
Re: (Score:2)
Of course they could. My point is that they would end up needing to modify the cores themselves significantly to ramp up the core count, and that those sorts of changes would, I suspect, be too significant to pay off when you're talking about a laptop.
Well, ramping up the core count is one way of boosting performance; however, I don't expect an Ax MacBook to be a powerhouse. Again, I expect Apple would replace the MacBook Air with it if they do it.
And no, the reason they design their own chips is that they can blow the doors off of what the other companies achieve in terms of power consumption by hand-optimizing the heck out of the core designs. On cell phones, that makes a big difference, and in quantities of tens of millions, the R&D cost per unit is small. On laptops, that makes a much smaller difference (because the batteries are huge by comparison) and the R&D cost per unit is relatively large.
But Apple is not doing this from scratch. They have a design already. Whether it is enough to power a laptop is a different question.
And yet every time I open up my Xcode project at work, Xcode sits there with a single CPU core pegged at 100% for a couple of minutes just to load the project, and several more minutes before it stops SPODing long enough to be usable, and basically the CPU is pegged at 100% for about an hour before indexing finishes. Real-world code doesn't always parallelize easily.
I'm not understanding your argument. First you are saying it's hard to do multicore CPUs and get it right. But you're also saying multicore usage in the real world does not work well.
Re: (Score:2)
The only downside to this decision is battery life (Score:1)
Yup. The only downside. In a portable device. Idiots.
Re: (Score:1)
New MacBook Pros Max Out At 16GB RAM Due To Battery Life Concerns [slashdot.org]
Re: (Score:1)