Qualcomm's Next-gen CPU for PCs Will Take on Apple's M-series Chips in 2023 (theverge.com) 100
Qualcomm is looking to seriously beef up its PC processors, with the company announcing plans for a next-generation Arm-based SoC "designed to set the performance benchmark for Windows PCs" that would be able to go head to head with Apple's M-series processors. From a report: Dr. James Thompson, Qualcomm's chief technology officer, announced the plans for the new chips at the company's 2021 investor day event, with the goal of getting samples to hardware customers in about nine months ahead of product launches with the new chip in 2023. The new chip will be designed by the Nuvia team, which Qualcomm had bought earlier this year in a massive $1.4 billion acquisition. Nuvia, notably, was founded in 2019 by a trio of former Apple employees who had previously worked on the company's A-series chips. The company is making big promises, too: in addition to offering competition to Apple's stellar M-series chips (which power its latest MacBook Pro and MacBook Air laptops and iMac and Mac Mini desktops), Qualcomm is aiming to lead the field for "sustained performance and battery life," too.
Diversity a good thing? (Score:4, Interesting)
Years back it seemed a good thing that a lot of home PCs and Macs made use of Intel (albeit a monopoly is not a good thing in itself).
It means there is more opportunity to run software designed for one platform on another.
Now it seems that the CPU world gets more diverse. I wonder if that is a good thing or not, but cannot answer this for myself.
Any thoughts on this?
Re:Diversity a good thing? (Score:5, Interesting)
Much depends on whether the processor change is also used as an excuse for other platform changes. Imagine, for example, locked-down firmware that can only boot Microsoft software. The ARM processor itself is not to blame, but it may be enough of a break with the status quo that vendors feel emboldened to impose stricter lockdowns.
ARM might become a tricky platform, depending on how nVidia's acquisition attempt goes and how they plan to manage it. For example, the very next release of Cumulus Linux after nVidia purchased it supported only nVidia switches (hardware they had separately acquired from Mellanox).
The Mac's move to x86 was a good thing due to a few factors:
-IBM sucked at competing with Intel/AMD on desktop/laptop use cases, so it was good to move over.
-Mac could then run Windows natively
-The 'hackintosh' concept opened interesting possibilities, though in practice they just end up fighting Apple
Going to ARM makes people hopeful that more players could participate. In addition to Apple and Qualcomm, nVidia and Marvell seem like likely companies to bring out an offering if the market is viable, and Intel and AMD may be able to participate should the day come that ARM software is preferred.
windows store only so no GOG & no steam = no g (Score:2)
windows store only so no GOG & no steam = no gamers
Re: (Score:2)
It's much ado about nothing.
Even if Qualcomm can produce a better CPU/GPU SoC than Apple, it will never get the performance or scale for mass adoption, because there is no "QualcommOS", nor anything that can operate on a Qualcomm chip that isn't a proprietary distribution.
Re: (Score:2)
Windows supports it and ChromeBooks will certainly support it.
In a world where the common user treats the computer as little more than the thing that the browser runs in, so long as the best of breed browsers fully support it, there's a shot of it getting market acceptance.
Re: (Score:3)
Depends on what you mean by "market acceptance."
If you mean most people, who just use their web browsers, will be fine with it: yeah, it'll do fine. If it's comparable to mid- or high-end Intel chips in performance-per-dollar, and draws fewer watts, it will do very well in higher-end applications. Which high-end applications adopt it depends heavily on the price-per-performance actually achieved. Incidentally: keep in mind that low-end engineering can do very well in performance-per-dollar, as long as the MBAs give them
Re: (Score:2)
next-generation Arm-based SoC "designed to set the performance benchmark for Windows PCs"
Ah, I see yer problem right there...
Non disclosure agreements? (Score:2)
Surely this would actually be a legitimate case for non-compete and non-disclosure agreements. People with A1 IP in their heads starting a company that will basically do the exact same thing they got paid for in developing the A1 tech to begin with? It's different from just using what you know when it's virtually a copy. And this sure sounds like it. But who knows. Qualcomm has such a big patent portfolio it's gotten Apple to back off before. And perhaps the last go-around they got some cross licensing
Re: (Score:3)
There's a lawsuit. Or there was a lawsuit [regmedia.co.uk]. As of February/March-ish it was still going on [9to5mac.com], but I haven't been able to find anything recent. I suspect it was quietly settled right before the Nuvia acquisition by Qualcomm in July.
Note: In Cali non-competes are illegal. So you can actually just quit and start your own company doing the same thing. It's only an issue if you are doing work for the new company (in this case: Nuvia) whilst still employed by the old (Apple). Which was Apple's initial beef with this G
Re: (Score:2)
People with A1 IP in their heads starting a company that will basically do the exact same thing they got paid for in developing the A1 tech to begin with?
Uh ya. Any IP is the property of Apple, of course, but the knowledge you acquired to invent Apple's IP? That's yours. Your company doesn't own that.
Non-competes exist to try to clamp down on that fact. Fortunately for freedom loving humans, they're illegal in California.
Re: Diversity a good thing? (Score:2)
Nature is diverse, so it would be good for technology too. When I learned of the APU, I thought it was a good leap, but I don't really know enough about ARM. It's likely good, but not in all ways. ARM seems a generally good route in general-purpose computing, especially where energy consumption is important. Is it better? That question is a bit absurd. Is a cheetah better than a lion? Within a domain, both are apex predators. Likewise this is good for creating competition for ARM based general
Re:Diversity a good thing? (Score:5, Interesting)
What's old is new again. Back in ye olden days when I first got into computing, you had multiple platforms, some adhering to basic POSIX standards, some that were way out there. It took some effort by developers to build suitably cross-platform and cross-architecture code. But it certainly was done, and done a lot. I worked on both Xenix and Linux systems in the early 1990s, and by and large the basic libraries were similar enough that the most I had to do was make sure the right Makefile flags were set.
As it is, even when OSs use the same architecture, the OSs can be sufficiently different that you're either going to have to recompile or install some sort of compatibility layer or virtualization if you well and truly want to have universal binaries. But a lot of code these days is written in languages like Python and Java, so, like Bash, Perl and sed scripts of the long ago time, you could often get quite a bit of production code working with very little effort.
Frankly I think it's good that the x86/64 world has some growing competition, and if it means we go back a bit in time for developers in having to pull their socks up and learn the tools that previous generations of coders used to get their software to compile on different machines, then so be it.
Re:Diversity a good thing? (Score:4, Insightful)
Diversity is an awful thing in almost anything that is foundational. The reason why desktop software is so easy to make today is that you only need to target one OS and one CPU instruction set to hit the overwhelming majority of the market. Today, that's Windows/x64.
The more instruction sets and OS's you have to target, the less development resources are left for developing actual software to do what it's supposed to do. Because in the real world (as opposed to the utopian visions so popular today), resources are highly limited. Remember Itanium? There's a reason why we use the AMD64 instruction set even in Intel machines today: unlike Itanium, AMD64 was fully backwards compatible with Intel's x86, the unified de facto instruction set for desktop computers.
You do not want diversity in your baseline elements of functionality. You want a high level of unification, because that allows you not to worry about all the diverse problems that come with a diverse baseline. You don't want a car market where every car has a completely different way of driving it from all other cars. You don't want a lot of shops, each of which accepts only its own kind of payment system. Diversity is a very, very bad thing in core functionality. You want the opposite of diversity there: high levels of unification. You want cars that all have steering wheels, an accelerator pedal and a brake pedal, so that when you know how to drive one, you know how to drive all. You want all shops to accept the same paper money and the same credit cards and other payment methods that can be read by a single reader, so you don't need to carry a separate card and separate currency for each shop you plan to patronize that day.
Highly unified baseline functionality means that you can actually build diverse products on top of it, because you don't need to commit massive amount of resources just to make sure that baseline works.
Re: (Score:3)
Ah yes, the "it's all in the cloud" argument. Meanwhile, in the real world, Chromebooks are ignored outside the kids' and elderly markets, and a lot of attempts at a SaaS model through thin clients have failed miserably and keep failing. Instead we got the draconian DRM models used by a lot of major software houses today, because most people still use a full-on PC at work.
Re: (Score:2)
"Driving is more than just SUVs these days. The market share for sedans is x%, plus another y% or so for Teslas. That leaves SUVs under z%, trending downward".
Ok. Doesn't interact with my argument about driving in any way though.
Re: (Score:1)
You can drive a car so you can drive a boat, plane, train, truck, motorbike and skateboard...
Oh wait, those things aren't all the same. Just like not all computing devices are the same either.
Why do I want my PC dumbed down to function like a tablet? My phone doesn't have a keyboard and mouse. My console is different again.
Re: (Score:2)
Apple was happy for diversity when it moved from Motorola to Intel. And when Intel fell behind (on performance per watt), it was happy for the diversity that allowed it to move to the ARM/M1 platform.
Also, we have the RISC-V platform waiting for its turn - depending on what path the USA takes in denying technology to China, RISC-V might be another usable way forward.
Re: (Score:2)
Apple is the exception because Apple doesn't sell computing devices. They sell fashion accessories that are meant to express the client's status and that also do computing on the side. In fashion, differentiation from the masses is the most important aspect.
Re: (Score:3)
The more instruction sets and OS's you have to target, the less development resources are left for developing actual software to do what it's supposed to do.
You have a point regarding OS's - but not regarding instruction sets, for that we have compilers.
Re: (Score:1)
And that's how we get a lot of modern software being absolute garbage tier in optimizations and security. Turns out "just use the relevant compiler" without knowing a lick about the CPU instruction set you're targeting results in abysmal performance and security holes galore.
Re: (Score:2)
The instruction set has absolutely nothing to do with security problems in modern software. Nor has performance. For most modern CPUs a mere human can not write better code than the compiler generates from a high level language.
If you were a software developer: you knew that :P
Re: (Score:2)
Yes, yes, just like Germany controls wind.
In real world on the other hand, software developers that aren't grossly incompetent know that both are true, and don't pretend that "compilers can write better code" also means "not knowing how CPU architecture works and how to optimize for it makes for better code".
And then are shocked, SHOCKED that their software is chock-full of bugs and performs like crap.
Re: (Score:2)
And then are shocked, SHOCKED that their software is chock-full of bugs and performs like crap.
Bugs you are writing yourself. Has nothing to do with the compiler.
software developers that aren't grossly incompetent know that both are true, and don't pretend that "compilers can write better code"
Since the late 1990s, when RISC started to rise, most compilers actually produce better code than an assembly programmer can. It has to do with opcode rearrangement, pipelines, cache lines, etc.
However, if you ar
Re: (Score:2)
Wait, so your argument about this is... that "they all compile to the same byte code" and "coder has no influence on code generation".
Yeah, this is on par with nonsense you peddled before. There are indeed no winters with temperatures below -20C in Northern Europe because of global warming.
In real life on the other hand, top tier exploit builders already target open source compilers and socially engineer people teaching so that common coding methods combined with certain bad coding practices generate specif
Re: (Score:2)
Wait, so your argument about this is... that "they all compile to the same byte code" and "coder has no influence on code generation".
If you have a counter example. Then give one.
There are indeed no winters with temperatures below -20C in Northern Europe because of global warming.
If you have a counter example, then give one. As stupid as you are, you must be below the age of 35. So: you won't have a counter example.
But you wouldn't know a lick about that, just like you wouldn't know lick about wind power, or glo
Re: (Score:1)
x64 in the desktop style console and arm on mobile. Did you have a disagreement?
Re: (Score:2)
That's like saying that x64 proves it wrong because AMD and Intel have fundamentally different cores and core interconnects.
Just no.
Re: (Score:2)
Hush Chinabot. You don't even get those cores.
Re: (Score:3)
Because software compatibility doesn't mean as much as it used to. Everything's in the cloud and your local computer is becoming a dumb display device. Not for everyone, but there are enough people whose computing needs are completely satisfied with just a browser. And many apps that are nothing more than a packaged up browser, varieties that all run under a common framework, etc.
The idea that you need Intel to run Microsoft Office is also dead; in fact Microsoft wants you to stop buying the desktop Offi
Probably (Score:5, Informative)
Performance and compiler tools have improved to the point you can put everything on top of a couple layers of hardware abstraction and not make the system completely unusable. You could see this emerging with Android, where you had Intel, MIPS and ARM architectures running, if not perfectly synchronized performance-wise, at least close enough to be usable for most tasks.
It was a no-brainer for Apple, who had switched their toolchain over to LLVM/Clang/Swift years ago, which natively supports ARM and X86. Unless you are doing something weird, recompiling from X86 to ARM using Clang for MacOS involves changing a compiler switch.
Re: (Score:2)
While Linux might be less useful for the desktop than Apple's MacOS, it runs on lots of platforms - PA-RISC, Motorola 680x0 (former Apple), ARM, ARM64, DEC Alpha, some Amigas, M1 (current Apple), x86, x64 and plenty of others.
The Chromebooks run on Intel and ARM.
Android runs on ARM and x86. The initial steps to run it on RISC-V were done too.
Re: (Score:2)
What hardware an OS runs on is irrelevant. The only question is will your software run. You don't use an OS, you use the software running on it.
This in many ways shows Linux is very far behind Windows and MacOS, both of which ship with built-in emulation for foreign architectures. Just firing up Ubuntu on an x86 machine and attempting to install and run a .deb package intended for ARM will not leave you with a happy feeling of success.
Re: (Score:2)
Just firing up Ubuntu on an x86 machine and attempting to install and run a .deb package intended for ARM will not leave you with a happy feeling of success.
Because one OS is clickity-click, maybe it worked, and the other OS requires more specific steps to be taken.
You can quite easily run ARM software on an x86 using Linux. You just have to configure stuff first. This situation is intentional; linux operating systems do not make a bunch of attempts to find a way to run a random piece of software when you click on it. see also: computer security
Re: (Score:2)
I didn't say you can't do something. I said it's quite far behind. The fact that you need to configure stuff is precisely the problem. Users aren't interested in fighting with their OS; they are interested in using their software.
I get it man, I really like Linux. I'm a tinkerer, I build things myself, and I get a kick out of the challenge of configuring something. I just don't recommend it to anyone or pretend it is suitable for non-tech end users.
Re: (Score:2)
No; having to configure stuff is not a problem. It is how we designed it. It is what we want.
You want something different; windows. That's fine. Use your windows.
You're "insane" (literally) to think that your preferences determine what my problems are.
Linux users don't fight with their OS. They learn how to use it, and they enjoy that it only does what they told it to do.
Re: (Score:2)
No; having to configure stuff is not a problem. It is how we designed it. It is what we want.
That's fine. As soon as the Year of Linux on desktop meme stops and companies stop trying to make it more "user friendly" or "desktop oriented" I'll agree with you. Until then those very people doing the designing are disagreeing with your point, and based on their current trajectory of making Linux a friendly desktop OS there is only one conclusion as far as emulation goes: It's currently deficient compared to the competition.
You're "insane" (literally) to think that your preferences determine what my problems are.
It's not my preferences. I have no skin in the game. I am labelling the complexit
Re: (Score:2)
It's not my preferences. I have no skin in the game. I am labelling
This claim makes you even crazier. You can't comprehend that linux is very popular with linux users.
Nobody gives a shit that you don't use it. It doesn't affect us at all.
Systemd is very popular with actual sysadmins. Including on slashdot. We just have a bunch of neckbeards who are scared their SysV scripts will stop working. Because they can't read well enough to figure out that those will keep working, they're just scripts, you run them after the systemd stuff.
If systemd was actually unpopular with lin
Already Lost (Score:3)
Re: Already Lost (Score:3)
Exactly, Qualcomm CPU design team has never been behind Apple, they just had different constraints, especially budget. For example, compare the die size of the Snapdragon 888 with the Apple A14: the Snapdragon has about 20% fewer transistors, so of course it will perform some 15% worse on certain benchmarks. Also, Qualcomm chose to optimize for video performance.
Re: (Score:2)
Exactly, Qualcomm CPU design team has never been behind Apple
They have: they completely dropped the ball on 64 bit. Now they use more or less off the shelf core designs from ARM. In fact the 888 gets its speed from the Cortex X1, where ARM broke with tradition and went for a core which doesn't minimize die area.
Re: (Score:2)
Qualcomm can make as large a chip as anybody else.
Unfortunately, the development cost must be spread over an immense number of devices. In a way, the first processor costs one billion dollars, and every one after that is $1 apiece. So if you can only sell a million processors, each one is $1000. If you sell a billion, each one is $2.
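The back-of-the-envelope math above can be sketched like this (the $1B development cost and $1 marginal cost are the commenter's illustrative figures, not real data):

```python
# Toy model of one-time (NRE) development cost amortization over unit volume.
# Figures are illustrative, not actual Qualcomm or Apple numbers.

def cost_per_unit(nre_cost, marginal_cost, units_sold):
    """Per-unit cost: development cost spread over every unit sold,
    plus the per-unit manufacturing cost."""
    return nre_cost / units_sold + marginal_cost

NRE = 1_000_000_000   # hypothetical $1B development cost
MARGINAL = 1          # hypothetical $1 per additional chip

print(cost_per_unit(NRE, MARGINAL, 1_000_000))      # 1001.0 -> roughly "$1000 each"
print(cost_per_unit(NRE, MARGINAL, 1_000_000_000))  # 2.0    -> "$2 each"
```

This is why volume, not raw design capability, decides whether a big chip is economically viable.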
Re: (Score:3)
They said they'd go head to head with the 'M-series', which I take to mean they would target the contemporary match for whatever Apple's roadmap comes out with.
Also, there's enough to suggest Apple *might* have a roadmap problem, as they experienced a brain drain in that area. It's hard to say until we see products come out without the people alleged to be key engineers, who are gone now, but there's a possibility.
Re: (Score:2)
What's Qualcomm's share right now? Windows on Snapdragon hasn't exactly taken the world by storm.
Re: (Score:3)
Apple is not interested in a large market share - they're in the part of the market where they can get a large profit - the top of it.
Remember that the operating system is included in the price, and operating system upgrades are free.
Re: (Score:1)
According to StatCounter, Apple has around 16% market share (however you define this) of the worldwide desktop OS market right now. It is around 25% in North America. Without really trying it seems, Microsoft is giving ground with the poor quality of Windows.
Another question is what OS a Qualcomm PC equivalent will run. Apple does not share MacOS. So will Qualcomm try to get Microsoft on board? Or perhaps create a Linux distribution?
Either will make for extra development costs and time.
Re: (Score:2)
According to StatCounter, Apple has around 16% market share (however you define this) of the worldwide desktop OS market right now. It is around 25% in North America.
Uh, lol.
That 25% is for mobile devices. Meaning phones and tablets.
macOS is at 6.82% according to StatCounter.
Re: (Score:1)
You forgot to filter for "Desktop". Over all platforms, you would be correct.
BTW, if you filter for mobile and tablets, Apple is even stronger with iOS. But I think that is a bit off topic, this discussion is primarily about desktop.
Re: (Score:2)
The numbers I was looking at were artificially deflated due to being in the running with Android and iOS.
Re: (Score:2)
Will Apple sell the M1 to others? It may be in Apple hardware, but that is still the very high end of laptops, so there's still a huge market for affordable and simple laptops.
Re: (Score:3)
M1s are massive slabs of silicon; trying to sell them without Apple hardware profit margins is a lost cause. Apple found a way to cream off the profit of almost the entire high end of the market, and now they pretty much have TSMC's two most advanced nodes on lockdown for consumer products (Intel will buy a little 3nm for server products, not consumer). They are snowballing into a monopoly on consumer computing hardware.
Modern ecosystem lock in is creating far more dangerous monopolies than in the past, Google s
Re: (Score:1)
Also, designing and building a processor that can compete with the Apple M1 will probably take more than two years, if one starts from scratch. My impression from other news is that Intel and AMD need around two years for a new generation, but they have a lot of existing knowledge to build on.
With what software? (Score:4, Interesting)
A big part of the Apple advantage has been the ability to combine hardware and software, to look at the software and identify places where hardware speed-up makes sense, and vice-versa, to move functions to software where hardware would be inflexible. This will be particularly noticeable in performance/watt, where power optimization is a real system (hw+sw) function.
Will Google and Qualcomm team up to do this for Android on Qualcomm ARM? Will Microsoft team up with Qualcomm to form WinComm?
And on the business side, will Qualcomm continue its patent licensing scheme? How will Qualcomm deal with patents in the SoC space that are owned by other companies (Intel, Apple, Samsung, etc)?
Seems to me there are a lot of questions and potentially a lot of risk associated with Qualcomm's move into this space.
I don't think software is Apple's advantage per se (Score:3)
Re: (Score:2)
That's an interesting point! For either Android or Windows, will the software/systems vendor have to implement something like Apple's Rosetta to make older applications run on the new system (HW & SW)? Apple has pulled this off a couple of times (68k to PPC, PPC to Intel, now Intel to M1), that's corporate knowledge that I'm not sure is matched elsewhere in the industry.
Re: (Score:2)
Microsoft Windows has a much larger body of for-purchase software. Games are the obvious thing, but there's an insane amount of software out there sold to businesses that fills a thousand and one weird niches. I used to support a print shop that had an invoicin
Re: (Score:2)
I'll grant you, it's not as performant as Rosetta2 in my tests, but to be fair, Rosetta2 gets to cheat since they can change the memory ordering of the CPU and qemu can't (without a kext).
Re: (Score:2)
Will Microsoft team up with Qualcomm to form WinComm?
No, why would they? /s https://www.microsoft.com/en-u... [microsoft.com]
Re: (Score:2)
https://tech.slashdot.org/stor... [slashdot.org]
Support! (Score:3)
Re: (Score:2)
With support like that they'll never be a serious competitor to anyone.
It's almost like you don't realise that Qualcomm is already the completely dominant player in the ARM SoC world and that basically every major company touching something ARM-related has used their CPUs already.
Yeah, they may be shit, but so is Microsoft, and we can't pretend the world is ignoring them and there aren't 1.3 billion Windows machines out there.
Qualcomm's ability to be a serious competitor will have nothing at all to do with their support.
Cue the lawsuits (Score:2)
Re: (Score:2)
Given how much love there is between Apple and Qualcomm, I foresee lawsuits coming ahead. I make no claim as to whether any notion of 'stolen trade secrets' or 'non-compete clauses' has any merit. Nevermind, too, that Qualcomm is looking to the desktop Windows market, whereas Apple is, well, Apple. Lawsuits are just how tech behemoths say
Target Customers? Chromebooks (Score:2)
Well, obviously Apple isn't looking to jump to a different CPU right now, so they're not going to be buying these chips. Windows PCs are pretty set on amd64 compatibility, though I suppose if emulation is fast enough, they might be able to get some to switch, but I wouldn't hold my breath.
That leaves Linux servers and Chromebooks. Data centers and laptops both are serious about minimizing power consumption, so moving away from x86 makes sense. Both are environments where changing CPU architecture isn't t
This seems as if... (Score:3)
...they are closing the barn door after the horses have bolted. Better late than never I guess, but Apple has been working on their silicon for a decade or more. It seems unlikely that Qualcomm will be able to go "head to head" with Apple silicon in a mere few years.
There's more than just chips in Apple's new hardware. It's the incredible optimization it's managed to create that will hold back a company that has been building for the lowest common denominator for virtually its entire history.
Re: (Score:2)
Better late than never I guess, but Apple has been working on their silicon for a decade or more.
LOL Wut. Apple's first ever ARM CPU was the A4, which shipped in the original iPad and the iPhone 4 in 2010. That was 3 years *after* Qualcomm released their first ARM CPU in 2007.
No, it won't. (Score:3)
Qualcomm didn't take a page out of nVidia's book; they wrote the damn thing on how to be a closed-source bastard. Need a datasheet? Sign a HUGE NDA and you can. Want to include their drivers? HUGE NDA, and it's still closed-source binary blobs.
Qualcomm is its own worst enemy when it comes to this stuff, and they are absolutely destroying the product they are trying to push before it's even out the door. The most likely outcome is that devices using this will have a non-replaceable kernel and no support.
Apple can outspend them too much (Score:2)
Apple will simply buy out TSMC's two leading nodes entirely; good luck competing two nodes behind.
Re: (Score:2)
Around 1/2 for desktop, Intel got stuck on 14nm for a long long time.
Apple won't make Intel's mistake. Apple wins by process advantage and they'll keep buying it.
Re: (Score:2)
AMD will get "6nm" (not even as good as N7+, but it sounds decent at least) and Intel will buy a little 3nm for server products.
Re: (Score:2)
They would have to pay a massive amount of cash to TSMC for that, and then they'd both be open to lawsuits up the wazoo.
Interesting (Score:2)
That'll be interesting. Would anyone run Windows if all the software has to be recompiled for ARM anyway?
Re: (Score:2)
Windows 10 on arm64 featured a preview of x86_64-to-arm64 translation. Microsoft has recently announced that this feature will ship in Windows 11.
I don't know if they are taking Apple's Rosetta 1 route of dynamic translation at runtime (which incurs a significant performance penalty) or the Rosetta 2 technique of taking the x86_64 binary, treating it as source code, and recompiling it to arm64 - which can perform at near-native speeds.
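The difference between the two approaches can be sketched with a toy example (purely illustrative Python, nothing like either Rosetta's actual implementation): a tiny "binary" of stack-machine instructions, run once through per-instruction dispatch and once through an up-front translation whose decode cost is paid only once.

```python
# Toy contrast: dynamic (per-instruction) dispatch vs ahead-of-time
# translation. The instruction format is made up for illustration.

PROGRAM = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]

def interpret(program):
    """Dynamic approach: decode and dispatch every instruction on every run."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

def translate(program):
    """AOT approach: pay the decode cost once, emitting straight-line
    host code (here: Python) that later runs with no dispatch loop."""
    lines = ["def _translated():", "    stack = []"]
    for op, arg in program:
        if op == "push":
            lines.append(f"    stack.append({arg})")
        elif op == "add":
            lines.append("    b, a = stack.pop(), stack.pop(); stack.append(a + b)")
        elif op == "mul":
            lines.append("    b, a = stack.pop(), stack.pop(); stack.append(a * b)")
    lines.append("    return stack[-1]")
    namespace = {}
    exec("\n".join(lines), namespace)
    return namespace["_translated"]

assert interpret(PROGRAM) == 20       # (2 + 3) * 4
assert translate(PROGRAM)() == 20     # same result, decode cost paid up front
```

The real systems deal with memory-ordering and self-modifying-code headaches this toy ignores, but the core trade-off is the same: translate every time you execute, or translate once and cache.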
Re: (Score:2)
which can perform at near-native speeds.
Eh, I haven't seen that at all.
It's good. Perfectly usable. But particularly in larger GUI apps, you can feel when it's an x86_64 app.
Re: (Score:2)
Would anyone run Windows if all the software has to be recompiled for ARM anyway?
Compiled for ARM? That's news to everyone who owns a Surface Pro X.
Re: (Score:2)
Would you consider the emulation on a surface "the performance benchmark for Windows PCs?"
Re: (Score:2)
Yes. Because it's the only emulation of its kind available, so by definition it is the benchmark. It's not fantastic, but it is perfectly functional. Very few people have issues with its performance; a far bigger issue is the lack of x64 support.
Now I have a question for you: in what application do you think the performance hit from the emulation wouldn't be acceptable? I am going to deduct a point from you for every time you mention a power user use case, or a use case linked to you earning money for w
Re: (Score:2)
Qualcomm isn't claiming they're going to be the performance benchmark for Windows on ARM. They're claiming they're going to be the performance benchmark for Windows PCs. No, your tablet isn't a performance benchmark for PCs, and emulated code isn't ever going to be.
custom instructions (Score:2)
What they need is a Platform (Score:3)
What allowed the "IBM PC compatible" to dominate was that it was a common platform. It wasn't just the Intel CPU/instruction set. It was also the BIOS and the fact that you always knew exactly how the machine would boot, that the first serial port was at I/O port 0x3F8 and the HD controller was on IRQ 14, etc. That meant anyone could write an OS for it, and anyone's hardware that followed the specs could run that OS.
As it is, every ARM SoC is different from every other, every ARM-based device is different from every other, and you can't just take the same copy of Windows that runs on a Microsoft tablet and run it on a Samsung (for example). This puts up barriers and prevents the rapid competition that brought PCs to the masses in the 80s and 90s. Having a standard platform would open the same "home built" market for ARM-based PCs that we have for x86/x64.
If someone with clout could enforce an "ARM PC" platform that all the HW/OS manufacturers can compete on, I expect we'd get a second revolution in computing hardware, possibly beating Apple again in the same way. Without that, ARM will continue to be confined to specific devices that the manufacturer decides to build.
Re: (Score:2)
As it is, every ARM SoC is different from every other, every ARM-based device is different from every other, and you can't just take the same copy of Windows that runs on a Microsoft tablet and run it on a Samsung (for example). This puts up barriers and prevents the rapid competition that brought PCs to the masses in the 80s and 90s. Having a standard platform would open the same "home built" market for ARM-based PCs that we have for x86/x64.
Once upon a time, this was invariably true. Like, really fucking bad. Every SoC literally had its own MMU implementation.
These days, a lot more is standardized.
You can run an arm64 image of Windows or Linux on any number of semi-modern Cortex-A* cores, as long as you have a device tree or an EFI blob.
Apple of course has their own cores, which are not Cortex-A*, and do not use any of the standardized peripherals (They went so far as to make their own fucking interrupt controller... something I think the
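For context on the device tree mentioned above: unlike a BIOS/UEFI PC, most ARM systems can't enumerate their own peripherals, so the kernel is handed a hardware description. A minimal illustrative fragment (the board name and addresses here are made up; "arm,pl011" is a common ARM UART binding):

```dts
/dts-v1/;

/ {
    model = "Example ARM board";            /* hypothetical board */
    compatible = "vendor,example-board";    /* made-up identifier */

    serial@9000000 {
        compatible = "arm,pl011";           /* common ARM UART binding */
        reg = <0x9000000 0x1000>;           /* MMIO base address and size */
        interrupts = <1>;                   /* simplified interrupt spec */
    };
};
```

On a PC, equivalent facts (port addresses, IRQs) were fixed by the platform standard; on ARM, every board ships its own description, which is exactly the fragmentation the parent comment is complaining about.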