Nvidia To Make CPUs, Going After Intel (bloomberg.com)
Nvidia said it's offering the company's first server microprocessors, extending a push into Intel's most lucrative market with a chip aimed at handling the most complicated computing work. Intel shares fell more than 2% on the news. From a report: The graphics chipmaker has designed a central processing unit, or CPU, based on technology from Arm, a company it's trying to acquire from Japan's SoftBank Group. The Swiss National Supercomputing Centre and U.S. Department of Energy's Los Alamos National Laboratory will be the first to use the chips in their computers, Nvidia said Monday at an online event. Nvidia has focused mainly on graphics processing units, or GPUs, which are used to power video games and data-heavy computing tasks in data centers. CPUs, by contrast, are a type of chip that's more of a generalist and can do basic tasks like running operating systems. Expanding into this product category opens up more revenue opportunities for Nvidia.
Founder and Chief Executive Officer Jensen Huang has made Nvidia the most valuable U.S. chipmaker by delivering on his promise to give graphics chips a major role in the explosion in cloud computing. Data center revenue contributes about 40% of the company's sales, up from less than 7% just five years ago. Intel still has more than 90% of the market in server processors, which can sell for more than $10,000 each. The CPU, named Grace after the late pioneering computer scientist Grace Hopper, is designed to work closely with Nvidia graphics chips to better handle new computing problems, such as AI models with a trillion or more parameters. Nvidia says systems working with the new chip will be 10 times faster than those currently using a combination of Nvidia graphics chips and Intel CPUs. The new product will be available at the beginning of 2023, Nvidia said.
How about dealing with a GPU shortage? (Score:3, Insightful)
Re:How about dealing with a GPU shortage? (Score:5, Insightful)
Appears someone important at Nvidia bought into the drivel management consultants were pushing. I am skeptical that Nvidia would be able to compete in the cutthroat CPU market, seeing how they can't even cope with increased demand in the GPU market. They have what is essentially a license to print free money and they can't even take full advantage of that!
How is Nvidia supposed to deal with their GPU shortages when they're a fabless company that outsources their GPU manufacturing to TSMC?
Re:How about dealing with a GPU shortage? (Score:2)
By announcing plans for adding more chips to their product mix and using that to generate funding to build their own fab.
Re:How about dealing with a GPU shortage? (Score:3, Insightful)
It takes years to build a fab and get it up and running. That will not address the immediate GPU shortage.
And you don't just acquire the skills needed to make advanced 5nm and smaller chips (what Nvidia needs) overnight. There's a reason why TSMC and Samsung are the only ones capable of successfully making them right now.
Re: How about dealing with a GPU shortage? (Score:4)
> And you don't just acquire the skills needed to make advanced 5nm and smaller chips (what Nvidia needs) overnight.
You may well not even acquire those skills over 40 years of running the most advanced fabs in the world.
Evidence - Intel
Re:How about dealing with a GPU shortage? (Score:5, Informative)
Nvidia has no expertise in fabbing ICs. They have no ability to build a fab nor run one.
The cost of getting into the game is astronomical. TSMC is investing $100B over the next five years. Nvidia's annual profit is $6B.
Companies with decades of fabbing experience and expertise are failing and getting out of the business, or only using legacy-scale technology (14 nm or larger).
Designing ICs and fabbing ICs are two completely different businesses. The only company left that tries to do both is Intel, and Intel is not doing well.
Re: How about dealing with a GPU shortage? (Score:2)
Re: How about dealing with a GPU shortage? (Score:2)
Would you happen to know where Huawei has its chips fabbed?
They are fabbed in Shanghai by SMIC at 14 nm.
Re: How about dealing with a GPU shortage? (Score:2)
Re:How about dealing with a GPU shortage? (Score:2)
Re:How about dealing with a GPU shortage? (Score:2, Redundant)
Build their own fab! Ahahahaha. Good one. Yes, because that's what one does when one needs to make more chips, they just "build their own fab". You guys crack me up.
The guy who said "hey, we should build our own fabs" got laughed out of the NVidia headquarters.
Re:How about dealing with a GPU shortage? (Score:2)
Not everyone can do it. But the world can handle at least one more. If they announce such plans, nobody else would try because it would be a losing proposition once the pandemic shortage ends.
There just has to be one company out there crazy enough to try to get funded and go for it. Even if it triples the size of their company in the process, it's still easier than a new startup trying to do the same.
Re:How about dealing with a GPU shortage? (Score:2)
Not everyone can do it. But the world can handle at least one more.
There is already one more: Samsung has a 5 nm fab.
If they announce such plans, nobody else would try
Nobody else would try because they would be laughing too hard.
because it would be a losing proposition once the pandemic shortage ends.
The lead time to build a fab is 5 years if you know what you are doing. Nvidia doesn't know anything about building or running a fab.
TSMC is investing $100B in new fabs, some at 3 nm. They would be on-line long before Nvidia could make their first chip.
There just has to be one company out there crazy enough to try to get funded and go for it.
Nobody would fund it.
Re:How about dealing with a GPU shortage? (Score:2)
By announcing plans for adding more chips to their product mix and using that to generate funding to build their own fab.
Building their own fab? Like it's just that easy. Even Apple doesn't have their own chip fabs, and Intel, which has decades of experience in chip fabrication, is lagging behind TSMC and Samsung. How on earth do you think Nvidia, a company with no experience in fabricating processors, is just going to "build their own fab"?
If they have the money then they are much better off investing in TSMC or Samsung, companies that do have experience with cutting edge processor fabrication, to build out more capacity for them rather than trying to compete with them.
Re:How about dealing with a GPU shortage? (Score:1)
Re:How about dealing with a GPU shortage? (Score:2)
By building hardware countermeasures into their silicon against being used for computing hashes: either block it outright, or throttle it after detecting long-term hashing.
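Just to illustrate the idea (in Python for readability, purely a toy sketch): a real hash-rate limiter would live in GPU firmware/drivers and use signals that aren't public, so the window, threshold, and "looks like hashing" classifier below are all invented.

from collections import deque

WINDOW = 600               # seconds of workload history to consider
HASH_LIKE_FRACTION = 0.9   # fraction of the window that must look like mining
THROTTLE_FACTOR = 0.5      # halve the clock when sustained mining is detected

class HashRateLimiter:
    def __init__(self):
        self.samples = deque(maxlen=WINDOW)

    def record_sample(self, looks_like_hashing: bool) -> float:
        """Record one per-second workload sample and return a clock multiplier."""
        self.samples.append(looks_like_hashing)
        if len(self.samples) == WINDOW:
            mining_fraction = sum(self.samples) / WINDOW
            if mining_fraction >= HASH_LIKE_FRACTION:
                return THROTTLE_FACTOR   # sustained hash-like load: throttle
        return 1.0                       # normal workloads keep full clocks

# Usage sketch: feed it one sample per second from whatever (hypothetical)
# heuristic classifies the current workload as hash-like.
limiter = HashRateLimiter()
multiplier = limiter.record_sample(looks_like_hashing=True)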
So you're suggesting that the way for them to get the best use out of their current ability to print money is to smash the printing press with a hammer so you can have a cheap GPU?
Good luck with that.
Re:How about dealing with a GPU shortage? (Score:1)
Re:How about dealing with a GPU shortage? (Score:1)
Re:How about dealing with a GPU shortage? (Score:2)
The price of cryptos has inflated so much that it is profitable to GPU mine some of them.
Yes, that should tell you something about the irrationality of this rise.
When the craziness subsides and prices come back down, only ASICs will be profitable, and even then only the ones made on smaller lithography. Of which there won't be many, as the fabs these days are busy with orders for everything but ASIC miners, so the makers can't produce more.
Re:How about dealing with a GPU shortage? (Score:2)
How is Nvidia suppose to deal with their GPU shortages when they're a fabless company who outsources their GPU manufacturing to TSMC?
Fab companies have limited capacity, that much is true. Companies like Nvidia, Apple, AMD and Qualcomm compete for that capacity. It works like a bid system: the company that offers the most is the one that gets more chips made.
So Nvidia can absolutely get more chips produced; they just have to up their bid, or engage with more manufacturers. They are not powerless, they just don't want to pay more.
In the long term this will balance out. TSMC profits from the bidding war, and they are currently investing the money in expanding their capacity.
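A toy model of that dynamic (all the numbers and customer names here are invented for illustration): a fab with fixed wafer capacity fills orders starting from the highest price per wafer, so a customer can get more wafers mainly by raising its bid.

def allocate_capacity(total_wafers: int, bids: dict[str, tuple[float, int]]) -> dict[str, int]:
    """bids maps customer -> (price_per_wafer, wafers_requested)."""
    allocation = {}
    remaining = total_wafers
    # Highest price per wafer gets served first.
    for customer, (price, requested) in sorted(bids.items(), key=lambda kv: -kv[1][0]):
        granted = min(requested, remaining)
        allocation[customer] = granted
        remaining -= granted
    return allocation

# Hypothetical example: raising your bid moves you up the queue.
bids = {
    "Apple":  (12_000, 40_000),
    "AMD":    (10_000, 30_000),
    "Nvidia": (9_000, 30_000),
    "Others": (8_000, 50_000),
}
print(allocate_capacity(100_000, bids))
# {'Apple': 40000, 'AMD': 30000, 'Nvidia': 30000, 'Others': 0}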
Re:How about dealing with a GPU shortage? (Score:2)
Yeah! Totally not at all like all those other companies that are having absolutely no trouble whatsoever keeping up with massive increases in consumer demand during a global pandemic that's causing shortages of everything from semiconductors to shipping containers at every step in the supply chain! You know, companies like ummm [theguardian.com] ...?
Re:How about dealing with a GPU shortage? (Score:2)
Just disregard that please, replied to wrong comment
Re:How about dealing with a GPU shortage? (Score:2)
Umm, because diversifying your product portfolio, as long as it's not too far from your core competencies, is usually a good idea.
After all, if your only big money maker is GPUs and your best argument for your product is more performance per dollar than the other guys, there is always a risk you get leapfrogged. Ask Intel.
At least if you have a solid set of complementary products, you might be able to push sales of your primary product based on being a single-source vendor, integration, or the complementary products being superior themselves.
Re:How about dealing with a GPU shortage? (Score:2)
Also, it's Acorn RISC Machine (ARM) based, a company they are attempting to acquire even though I think it is unlikely (I don't see the EU approving it). That said, ARM manufactures nothing and licenses its technology to pretty much everyone. This strongly suggests these machines will target the server market, though both Apple and Microsoft have taken steps to move end users to ARM, so that space should open up.
Apple jumped in with both feet, abandoning Intel. Microsoft is being a little more careful: the Surface Pro X is its first mainstream ARM PC (competitors have shipped in that space for years, usually running Linux, for example the Raspberry Pi), and releasing it with only x86 emulation compatibility (an x64 emulation preview was released on Dec 10, 2020; I haven't heard that it's fully released yet) was a pretty bold move for a computer in the $1,200+ space. They also released a Surface with a custom AMD Ryzen last year and apparently have a secret project with a possible third alternative (possibly Snapdragon).
Re: How about dealing with a GPU shortage? (Score:2)
Re:How about dealing with a GPU shortage? (Score:2)
Don't forget, Intel is quietly working on discrete GPUs as well.
Re:How about dealing with a GPU shortage? (Score:5, Funny)
We can't forget, we're reminded every decade.
Re:How about dealing with a GPU shortage? (Score:2)
Re: How about dealing with a GPU shortage? (Score:2)
They still do? They started in the late 1990s.
Re: How about dealing with a GPU shortage? (Score:2)
One way of dealing with this is by offering CPUs with integrated graphics. Buttcoin miners will not invest as much into CPUs where 90+ percent of the cost comes from non-GPU functionality. Yes, it will be slower than discrete graphics, but gamers and developers will adapt.
Re:How about dealing with a GPU shortage? (Score:2)
Yeah! Totally not at all like all those other companies that are having absolutely no trouble whatsoever keeping up with massive increases in consumer demand during a global pandemic that's causing shortages of everything from semiconductors to shipping containers at every step in the supply chain! You know, companies like ummm [theguardian.com] ...?
Just when you thought (Score:2)
Re:Just when you thought (Score:5, Funny)
> Just when you thought the news couldn't get any worse for Intel.
Intel's problems are largely speculative.
Re:Just when you thought (Score:5, Insightful)
It's not good news for Intel or AMD.
AMD stock dropped 3% on the news
Re:Just when you thought (Score:2)
So what fantastic news did AMD have that made its stock skyrocket by 3% close to two weeks ago? 3% is a completely irrelevant move in the stock market. I mean, just look at the past 4 weeks: there have been 3% moves on March 17th, March 22nd, and March 31st, then it was stable for 2 weeks and now another 3% move.
If you look at AMD's stock price over this time frame you couldn't even identify the NVIDIA announcement, to say nothing of extending the view back a few months (AMD is 20% down from where they were in mid-February).
Comment removed (Score:2)
Re:Just when you thought (Score:1)
I remember another company that decided to bequeath the low-end server market to another company, thinking that they would continue to profit on the higher-end systems.
Sun Microsystems ended up imploding and getting sold off to Oracle in a fire sale, with Intel pretty much being able to dominate the server space for the last decade.
This stupidity will repeat itself, only with Intel taking the place of Sun. I'm not saying that Intel will roll over and die, but they will have a much harder time having to defend against both AMD and NVidia / ARM, regardless of whether the NVidia + ARM merger is completed or not.
In general, I've never understood why companies get this mentality... if it's profitable business, and is in line with your core, keep it! The $5 you're keeping is $5 the competition isn't getting! Yeah, it may be loose change, or a rounding error, but you're already making it! That $5 company today may be your $500 customer tomorrow. But even if they aren't, $5 is $5.
Re:Just when you thought (Score:2)
Nvidia is in an excellent position to make ARM more relevant in the compute-heavy part of computing because of their GPU tech.
People already doing most of their heavy lifting on GPU will be able to get the same performance for less money by avoiding paying for a full x86 processor.
Re:Just when you thought (Score:2)
As I commented elsewhere, Microsoft has a preview of Windows with x64 compatibility out and has x86 support already (I've heard Adobe software still has issues, so it isn't ready for prime time yet). Apple has already fully moved into that space with the Mac, and I believe they have both an emulator and Rosetta 2, which converts apps to native ARM on install (that still doesn't mean there won't be app problems; for example, VirtualBox depends on x86/hypervisor/etc.). Linux has multiple ARM-based flavors, but 3rd-party binaries not in the distro can be hit and miss (I usually build from source anyway, so no real issues for me). Also, QEMU still doesn't support x86_64 AFAIK (the emulator that I usually find installable with distributions).
Re:Just when you thought (Score:2)
Yes, because NVidia will produce these advanced chips on either Unicorn farts or the slot they'll be scheduled in TSMC's fabs in..(checks watch) 2031.
Slight exaggeration, but the fact is TSMC's fabs can only make so much shit. Neither AMD nor Intel need be overly worried about this unless they literally fucking sit still for the next 3 years.
Re:Just when you thought (Score:1)
Thanks for turning this into yet another red/blue culture-war debate. (Reagan was often wrong in my opinion, by the way.)
Anyhow, a chip shortage is good news for almost ALL CPU companies, even laggards, as they can raise prices and have bigger profit margins. The trick is using this temporary revenue boom to out-grow the competition before things settle back to normal.
It's like an automobile race where the good news is you get a free rocket booster. The bad news is so does your competition.
Re:Just when you thought (Score:2)
I couldn't agree with you more on the benefits of competition. I have no doubt that the "shortage" will soon be resolved - the incentives are high.
But the intervention of Joe Biden will almost certainly end up being anti-competitive, and encourage the formation of a cartel. The shortage won't end any quicker, and temporarily high prices will become permanent.
Politicians will politic, and the base will be roused.
Re:Just when you thought (Score:2)
Ah yes, Saint Reagan. This is the man who decided that the Fed. Gov. was so bad they needed to turn over some of its functions to the Beltway Bandits, because they certainly will make it less expensive, yes? So he basically screwed up the Fed. Gov., and then the rest of the Republicans started campaigning on the Fed. Gov. not working. Their industry campaign contributors were only too happy to belly up to the bar.
Now we have an underfunded IRS, infrastructure that's falling apart, mega-corporations that can push the rest of society around, and a boy-child who instigated an attack on the Capitol because he threw a temper tantrum about losing the election. The response of the Republicans? Why, stop those damn Democrats from voting.
Re:Just when you thought (Score:4, Informative)
It's going to get worse. https://finance.yahoo.com/m/df... [yahoo.com]
"The nine most terrifying words in the English language are, 'I'm from the government and I'm here to help." -- Ronald Reagan
How did the rest of the quote go?
I think you all know that I've always felt the nine most terrifying words in the English language are: I'm from the Government, and I'm here to help. A great many of the current problems on the farm were caused by government-imposed embargoes and inflation, not to mention government's long history of conflicting and haphazard policies. Our ultimate goal, of course, is economic independence for agriculture, and through steps like the tax reform bill, we seek to return farming to real farmers. But until we make that transition, the Government must act compassionately and responsibly. In order to see farmers through these tough times, our administration has committed record amounts of assistance, spending more in this year alone than any previous administration spent during its entire tenure. No area of the budget, including defense, has grown as fast as our support for agriculture.
Oh, that's right.
Source: https://www.reaganfoundation.o... [reaganfoundation.org]
Re:Just when you thought (Score:2)
And we have a winner! Bravo!
As an aside, it is gratifying to see agribusiness back on its feet and no longer dependent on taxpayer handouts, thanks to the Gipper.
Re:Just when you thought (Score:2)
I don't see any sarcasm tags, especially with the amount of taxpayer handouts to farmers that were done over the last 4 years.
Comment removed (Score:2)
AMD Laptop GPUs? (Score:5, Interesting)
Re:AMD Laptop GPUs? (Score:2)
AMD can get away with not having raytracing and MLP accelerators for the moment, but they really need an alternative to DLSS (built on something faster than big MLPs, such as TSVQ). Without it they have a hard time competing.
Re:AMD Laptop GPUs? (Score:2)
Raytracing seems more important to me than DLSS. I'm not sure DLSS does anything except in some specific instances.
Re:AMD Laptop GPUs? (Score:3)
> I'm not sure DLSS does anything except in some specific instances.
DLSS is basically "smart upscaling" or smart upsampling.
For example, a naive upscaling of a native 480p to 1080p is going to have massive artifacts (jagged lines, etc.). Anti-aliasing is one attempt to draw smooth edges but it has problems with transparency. [wikipedia.org]
With DLSS you render a high-resolution "ground truth" or "reference image", say at 16K resolution. Then you compare how your native 480p upscaled to 1080p looks against the reference image. You feed these comparisons back into a machine learning algorithm to improve the quality. This is the learning part.
Then when gamers run their game at low resolution you can basically render at extremely low resolutions [youtube.com], apply DLSS 2.x and get decent quality upscaled results at fantastic performance since you are rendering at a much lower resolution.
Obviously there are tradeoffs. Objects get fuzzy and distant objects don't have spatial information so you get shimmering and fizzing.
While raytracing is important, IMHO denoisers and smart upscalers will remain important for decades -- especially with the diminishing returns of 4K+ resolutions, when you can natively render at 1080p and DLSS-upscale to 4K+.
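For anyone who wants the "learning" step above in concrete terms, here is a minimal, purely illustrative sketch in Python/PyTorch: a tiny network learns to upscale low-resolution frames toward high-resolution "ground truth" renders. The network, data shapes, and loss are all stand-ins I made up; real DLSS also uses motion vectors, temporal history, and runs on dedicated tensor cores.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Upscale 2x by predicting residual detail on top of bilinear upsampling."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res):
        upsampled = F.interpolate(low_res, scale_factor=2, mode="bilinear",
                                  align_corners=False)
        return upsampled + self.body(upsampled)  # naive upscale + learned detail

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-ins for (low-res frame, high-res reference render) training pairs.
low_res = torch.rand(8, 3, 135, 240)    # e.g. quarter-resolution game frames
reference = torch.rand(8, 3, 270, 480)  # "ground truth" reference renders

for step in range(10):
    optimizer.zero_grad()
    loss = F.l1_loss(model(low_res), reference)  # compare against the reference
    loss.backward()
    optimizer.step()

At inference time only the cheap low-resolution render plus the small network runs, which is where the performance win comes from.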
Re:AMD Laptop GPUs? (Score:3)
Right now if you have an APU or use integrated graphics you're largely limited to 720p unless you want a slide show. DLSS offers the ability to make 1080p gaming at reasonable FPS an actual possibility without having to get a dedicated GPU.
Additionally the dedicated hardware for ray tracing isn't particularly useful for anything that isn't ray-tracing or can't be mapped onto it. DLSS on the other hand requires hardware that makes ML algorithms run faster and that's a lot easier to generalize. If it comes down to where to spend the transistor budget, I think the hardware accelerators for something like DLSS are far more valuable than those that make ray tracing feasible.
Re:AMD Laptop GPUs? (Score:2)
There are many ways to skin a cat; you can have a classifier determine blending parameters based on something more efficient, such as TSVQ.
Re:AMD Laptop GPUs? (Score:2)
DLSS on the other hand requires hardware that makes ML algorithms run faster and that's a lot easier to generalize. If it comes down to where to spend the transistor budget, I think the hardware accelerators for something like DLSS are far more valuable than those that make ray tracing feasible.
These days there is no transistor budget. They're effectively unlimited. In laptops and small form factor PCs the limiting factor is now Thermal Design Power. It's easy (for very expensive values of fab 'easy') to pack more than enough transistors in to overwhelm any compact cooling system.
Nowadays the conversation is where to spend the watts. If you can dream it up, the transistors are there. Keeping them from melting is the hard part.
Re:AMD Laptop GPUs? (Score:1)
AMD has ray tracing in the desktop GPUs now. They are still behind in performance, but AMD has got a foot in the door.
Re:AMD Laptop GPUs? (Score:2)
Re:AMD Laptop GPUs? (Score:2)
nVidia's "Ray Tracing" is kind of a half baked, anyway. Yes, it traces rays, but it bothers me that they call it ray tracing because it isn't traditional ray tracing. It's like calling socialism communism when communism is a subset of socialism and the reverse is not true - also socialism's definition is nebulous as is ray tracing.
A "server" CPU (Score:2)
This is clearly targeted at HPC, not at web and business logic... saying this is for servers, while strictly true, is misleading.
Just say it (Score:0)
Let's be even more specific, this is clearly intended for bitcoin mining.
Re:Just say it (Score:2)
Ethereum miners don't need massive interprocessor bandwidth.
This is clearly not intended for the last few months of Ethereum mining.
Title should say "design" instead of "make" (Score:5, Interesting)
You need to own chip fabs to actually "make" CPUs. Intel has fabs, and is building more. Nvidia begs for time on 3rd party fabs to get their stuff made.
Re:Title should say "design" instead of "make" (Score:2)
I wouldn't characterize paying for a service for which there are multiple suppliers as begging. I am sure Intel has third-party component and machine suppliers; they don't make everything from dirt, do they?
Re:Title should say "design" instead of "make" (Score:2)
If it is critical to your core business then you are pretty much caught begging. One layer of abstraction should be in-house; additional layers can reasonably be outsourced if that is the only viable solution.
Re:Title should say "design" instead of "make" (Score:2)
Multiple suppliers? You mean 2, one of whom (Samsung) may or may not be especially amenable to making a possible future competitor's chips?
TSMC and Samsung can't make enough chips right now. There's no clear sign this will get better anytime soon. Might be a good time to have your own fabs, even if you're a generation behind.
Re:Title should say "design" instead of "make" (Score:2)
They pay for it. I don't beg the supermarket for food, I don't beg my landlord for an apartment, I don't beg my ISP for Internet.
Re:Title should say "design" instead of "make" (Score:2)
They pay for it. I don't beg the supermarket for food, I don't beg my landlord for an apartment, I don't beg my ISP for Internet.
You must not have Comcast...
Re:Title should say "design" instead of "make" (Score:2)
They pay for it. I don't beg the supermarket for food, I don't beg my landlord for an apartment, I don't beg my ISP for Internet.
There's no supply shortage of the things you mentioned, like there is for chips. Actually, if you pay rent in US dollars, it's probably about to go up due to a huge increase in the money supply and corresponding real estate inflation. For more info, you can search for "chip shortage." You can also search for an Nvidia GPU, but you won't find one for sale at retail.
Nvidia going Google... (Score:2)
...Remember when Google was just a search engine, and Nvidia just made graphics cards?
This happens to all companies that achieve overwhelming success, such as Google, which branched out into everything, even self-driving cars.
Nvidia is experiencing a sales boom not even they had expected, 3x the demand; this gives them an unprecedented opportunity to invest in everything. And since they're essentially already producing GPUs (which are essentially a form of CPU, just with more specialized instructions), there's nothing stopping them from going down that route.
Re:Nvidia going Google... (Score:3)
They'll keep expanding until eventually they handle email.
Re:Nvidia going Google... (Score:3)
Re:Nvidia going Google... (Score:2)
> and Nvidia just made graphics cards?
Uhm, nForce [wikipedia.org] (2001) and Tegra [wikipedia.org] (SoC 2008) ?
Re:Nvidia going Google... (Score:1)
Voodoo3 (3dfx) in 1999; I know because I bought one back then. GREAT graphics card.
Nvidia has no fabs. (Score:2)
Samsung, TSMC, and Intel are building fabs in the US. Only one of these three is an American company.
If you're the military, who gets the contract?
Re:Nvidia has no fabs. (Score:2)
Well, the two non-US companies are still from countries which have historically been strong allies with the US.
Re:Nvidia has no fabs. (Score:3)
The question is, do we fight World War 3 when China finally makes its move for Taiwan? They only have one aircraft carrier, but it's doing "training" operations there and they have planes in Taiwanese airspace.
Fun fact: Mao backed off Taiwan because we threatened nukes. Somehow I don't think it would play out that way this time.
Comment removed (Score:2, Insightful)
Re:Nvidia has no fabs. (Score:2)
Oh not just that. There's nothing preventing Taiwan from doing the same thing Saddam did when pulling out of Kuwait.
Re:Nvidia has no fabs. (Score:2)
20 years ago the same thing was said about marginalizing Hong Kong.
Re:Nvidia has no fabs. (Score:2)
Re:Nvidia has no fabs. (Score:3)
The US military is probably the last large entity on the planet that would migrate to a new CPU architecture. They're not going to overcome their momentum for decades, even if this new chip is utterly fantastic.
They'd be looking at AWS, Google and Azure long before the US Military.
Comment removed (Score:2)
I'll pass. Thanks (Score:1)
Idiots (Score:2, Interesting)
Why didn't they go with RISC-V? Almost wishing the UK blocks their purchase of ARM.
Re:Idiots (Score:4, Informative)
RISC-V has an interesting future, but we are still years away from a package to compete with other high-end CPUs and the tooling surrounding it.
Re:Idiots (Score:2)
Fair enough .. but that's where we should encourage nVidia to help.
Re:Idiots (Score:2)
Nvidia is a for-profit company. Right now GPUs and ARM CPUs are the biggest sellers so that's what they're going to make.
You may want to ask smaller, low-volume entities for RISC-V hardware, such as the Raspberry Pi Fondation. Keep in mind that even their newly-launched $4 USD Raspberry Pi Pico uses a dual-core ARM CPU.
Re:Idiots (Score:2)
Yes they are a for-profit company .. that's why I am saying shareholders and people making purchase decisions should reward them for short term gains.
Re:Idiots (Score:2)
*should not reward them (not should)
Re:Idiots (Score:2)
It's spelled fondation in french, my mistake. Spellcheck has a hell of a time trying to auto-correct and point out mistakes with bilingual people.
ARM vs. X86 (Score:3)
These appear to be ARM chips, not x86. It's not clear if ARM is good for higher-end servers. x86 has a head-start in server-oriented features and instructions such that ARM may not be able to catch up any time soon. Investors don't like long-term bets and may be disappointed.
Re:ARM vs. X86 (Score:3)
not x86. It's not clear if ARM is good for higher-end servers
That depends on the server. Are you after performance at all costs, or performance per watt? ARM servers already exist for this reason. I wonder if NVIDIA is taking on Intel or if they are taking on the Marvell ThunderX2. https://www.gigabyte.com/Enter... [gigabyte.com]
Re:ARM vs. X86 (Score:3)
The x86 CPU is a jack of all trades. It can do anything, but it can't excel at any of it.
The ARM + GPU combo can do anything better and faster, at lower power, than x86.
Re:ARM vs. X86 (Score:3)
Technically, ARM hasn't proven to be better at performance or performance per watt at high end.
Most of what ARM got right before Intel tried to target the market was sophisticated sleep states. Intel's sleep and idle states were coarse and not useful for an always-connected, mostly 'sleeping' platform and a good battery life was simply not in the cards. Intel has at least gotten their platform better at this now, though too late to shift the mobile market since ARM went first.
The other thing ARM did was go lower end than Intel was willing. Intel wouldn't make low-cost, low-performance chips, and those were just what the mobile market needed.
Now make an ARM processor at a 100W TDP: will it outperform Intel or AMD? Probably not; not because the architecture is incapable, but because the software ecosystem needs to catch up. It can happen, but it's not as simple as "ARM is better".
Re:ARM vs. X86 (Score:2)
What an ignorant comment. Don't be a Moran.
Re:ARM vs. X86 (Score:2)
I'm trying not to don't be one!
If they could now.... (Score:1)
NVIDIA has no choice but to make CPUs (Score:2)
ARM CPUs and ARM PCs (Score:3)
are going to become a standard and eat Intel's and AMD's lunch. Apple already switched to ARM, and soon a bigger computer will be based on the Raspberry Pi with a faster ARM chip.
I wanted to buy an ARMiga, which is an ARM-based Amiga that emulates old Amigas through software.
Explaining CPUs to Slashdot. (Score:2)
Are you fucking serious??
Start being editors, you lazy fucks!
FIRST (Score:1)