AMD Confirms Commitment To x86
MrSeb writes with an excerpt from an Extreme Tech story on the recent wild speculation about AMD abandoning x86: "Recent subpar CPU launches and product cancellations have left AMD in an ugly position, but reports that the company is preparing to jettison its x86 business are greatly exaggerated and wildly off base. Yesterday, Mercury News ran a report on AMD's struggles to reinvent itself and included this quote from company spokesperson Mike Silverman: 'We're at an inflection point. We will all need to let go of the old 'AMD versus Intel' mind-set, because it won't be about that anymore.' When we contacted Silverman, he confirmed that the original statement has been taken somewhat out of context and provided additional clarification. 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.' The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."
Considering Bulldozer ... (Score:5, Interesting)
The larger truth behind Silverman's statement is that no matter what AMD does, it's not going to be 'AMD versus Intel' anymore — it's going to be AMD vs. Qualcomm, TI, Nvidia, and Intel."
Considering the execution of Bulldozer, you could possibly add AMD to the vs. list.
Re: (Score:3)
Granted, Bulldozer is...painful to look at. However, I am willing to give AMD the benefit of the doubt, and allow them one upgrade cycle to fix the bugs in their design before considering the competition. They claim that this design will ramp up better than the previous stuff, and others have claimed that a few software patches are needed for various OSs like Windows to take advantage of the change in architecture.
Mind you, it does kind of feel like Intel with the Itanium (the Itanic), but thankfully this d
Re:Considering Bulldozer ... (Score:4, Insightful)
The problems with Bulldozer are more than can be fixed by a few revisions or software patches, I'm afraid.
I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating point maths. In fact that has been the case for a decade or more, going back to P4 and Athlon eras where they were adding more FPUs to single cores. So what does AMD do? Let's have more cores and fewer FPUs!
I can only assume they were hoping that more of the heavy floating point computation would be handled by the GPU. Meanwhile Intel's current generation have added new instructions that outperform GPUs in tasks like video transcoding. It breaks my heart because I was really looking forward to Bulldozer as I have always favoured AMD. Their sockets last much longer than Intel's who seem to dream up a new one for every CPU revision, and you get all the features that Intel charges extra for like ECC RAM support.
I think the best thing they can do now is revise the design and release the next generation as early as possible because this one is going nowhere.
Re: (Score:2)
I honestly can't figure out what AMD was thinking. Every demanding desktop task these days makes heavy use of floating point maths. In fact that has been the case for a decade or more, going back to P4 and Athlon eras where they were adding more FPUs to single cores. So what does AMD do? Let's have more cores and fewer FPUs!
I thought that the standard 128 bit FPUs were independent between the modules as before. The only sharing that happens is when an AVX instruction is issued and they get merged to be a si
Re: (Score:2)
Re: (Score:2)
Unfortunately, the 2nd-gen Bulldozer chips promise no bigger performance improvements than Ivy Bridge does. Intel's internal documents are leaking [hwbot.org]: expect a 10-15% performance gain, 20% lower power consumption (95W -> 77W), and an HD 4000 about 50% faster than the HD 3000, which will take another chunk of the discrete graphics market. Their 22nm 3D transistors are a real kick in the nuts for AMD; they look like the Core equivalent of a die shrink. Not exactly the competition AMD needed right now.
Corporate double-speak (Score:4, Insightful)
AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud.
This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:
"Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."
This statement is true even though AMD and Freescale aren't competitors.
This is the kind of garbage that makes employees think that their managers are clueless and don't know how to fix the company.
Re: (Score:2, Funny)
Re:Corporate double-speak (Score:4, Funny)
"PRUNEJUICE is a leader in DRINK YO PRUNE JUICE design, and we remain committed to the DRINK YO PRUNE JUICE market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."
A warrior's drink.
Re:Corporate double-speak (Score:5, Insightful)
Meaningless marketing spin is the only kind of public statement that:
A. doesn't cause controversy and anger among investors/stockholders, and
B. you aren't forced to walk back 12 months down the line when you find out you were too optimistic and/or out of touch.
Re: (Score:3, Insightful)
Meaningless marketing spin should cause controversy and anger among the stockholders. If I'm investing my money in a company, I want to know they have real plans, not just platitudes. Buzzwords are a sign that they have no idea what they're doing. Take your money and run.
Re: (Score:2)
that's what board meetings are for, not conversations with the press.
Re: (Score:2)
Stores have figured out how to dump the unprofitable customers (long live data mining). Now they've figured out how to dump the skittish investors. You weren't wanted in the first place. Actually, the game was played this way all along.
Reason is a short leash. The receiving side will take blind faith any time they can get it.
Re: (Score:2)
I'd look for firm dates and roadmaps provided to customers, partners, and investor relations as a sign that the company knows what they're doing.
Re:Corporate double-speak (Score:5, Funny)
This is a completely meaningless statement. You could say the same exact thing about any microprocessor company. For example:
"Freescale is a leader in PowerPC microprocessor design, and we remain committed to the PowerPC market. Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."
This statement is true even though AMD and Freescale aren't competitors.
Freescale commits to the cloud? That is BIG news. Time to run out and adjust my stock portfolio!
Re: (Score:3)
Do their chips [still] overheat? (Score:2)
In the past, I always advocated for, and deployed, AMD-chipped systems. I was once burned by my advocacy when several AMD mobos all got fried!
This contributed to my getting fired, though a poorly written application was partly responsible. My employer wouldn't listen, because other AMD systems had survived; but they only survived because they were next in line to run the application.
What is the experience of slashdotters using these systems? Do they still consume lots of power or overheat?
Re: (Score:3)
AMD chips seem to consume more power than comparable Intel chips, but I'm pretty sure they have thermal throttling these days.
I was impressed with one of the old P4 systems in my previous job because the fan was just lying on top of the heat-sink and every once in a while someone would knock it off and the CPU would just throttle down until someone got around to putting it back (yeah, I don't know why we never spent a few dollars to buy a fan that could actually be screwed into place). In those days an AMD
Re: (Score:2)
I remember reading an old Tom's Hardware article dating back to the Athlon XP days about this.
Re: (Score:2)
As someone who bought an Athlon XP during that era, and in fact did exactly that (Fan failed, possibly due to dust, possibly just bad bearings)
Idiot. The Athlon XP had thermal throttling. The original (600MHz) Athlons didn't, and neither did the Thunderbirds IIRC, but the XP certainly does have it.
I write my own code so I know how to get the best out of the CPUs I buy. The Bulldozer looks like a very interesting design with a great deal of potential. I waited until the Phenom II came out to buy one and I'
Re: (Score:2)
Re: (Score:2)
It probably has, Cool 'n' Quiet has been around for nearly a decade, so I'd bet that would be about the same time that they added the temp sensors and throttling. IIRC that was about the same time that Intel introduced similar technology.
Re: (Score:3)
Yes you do have to do your research and make sure you purchase a compatible motherboard, that is true whether you are dealing with Intel, AMD, or VIA.
125W is still the highest of the desktop chips from AMD, but many AMD motherboards are rated at 140W. You can buy a motherboard that only supports up to 95W, but why would you do that?
Intel has some 130W chips.
What's he gonna say? (Score:5, Insightful)
Let's say AMD is planning - or thinking about, at least - stopping the manufacture of x86 processors. What's a responsible company spokesperson going to say? "Yes, we're working on an exit strategy and are hoping to be out of the business by 2014" - does anyone believe that would be stated? If it was, their x86 business would tank immediately, and all employees working on x86 now would update their resumes and get while the getting is good.
Several years ago, we had an important faculty member accept a dean-ship at another university. The lead time was going to be a bit more than a year. In the meantime, this faculty member still had research projects going full bore. So what did he do? He told his staff that the research projects were going to continue, and would remain at our university for the foreseeable future. Guess what happened a year later? Yup - the "foreseeable future" he spoke of 12 months before turned out to be almost exactly 12 months long.
Re: (Score:3)
Or it could just as easily be someone floating a balloon -> a rumor is reported through various sources, and AMD gets a preview of how the market might react. Depending on the reaction, they might go one way or the other.
Re:What's he gonna say? (Score:5, Interesting)
I'd be willing to bet that one of AMD's investors is Intel, and while AMD may want to get rid of the x86 business, Intel won't let it.
Intel needs AMD. And AMD's weakened state is ideal for Intel. However, if AMD dies, Intel also suffers (think anti-trust). But with AMD alive, Intel's scrutiny is lowered and they can sell more chips easily.
Heck, I'm willing to bet Intel has next-gen chips ready, but they want to keep AMD viable and are holding off the release. There's no benefit to Intel other than a few percent marketshare if AMD dies, and there's a huge downside of EU regulators, US regulators and very close scrutiny.
Re: (Score:2)
AMD already is using next-gen chips. They would love for x86 compatibility to no longer be a line item for those chips, because it represents significant wasted silicon. I don't think we're there yet though, and I don't think AMD thinks so either. Only when Windows XP is gone, and the machines that run it along with it, will we truly be ready to move Windows to the 64 bit era. From what I can tell, most of the machines that have come with it are 64 bit anyway, whether they came with 64 bit windows or not, b
Translation (Score:5, Interesting)
"Our strategy is to accelerate our growth by taking advantage of our design capabilities to deliver a breadth of products that best align with broader industry shifts toward low power, emerging markets and the cloud."
We will continue to make chips for servers and low-end crap. We can't compete with Intel in the consumer market in the short to medium term, but we are still relevant in business circles.
Consumers, prepare to be gouged by Intel as soon as it figures this out. And beyond AMD just now "saying it," this has been the truth for some time, years in fact. I don't know if it's AMD stumbling or Intel just continuing to hit home runs, but there hasn't exactly been a whole lot of competition since the days of ye olde Athlon 64 series of processors. Ever since Intel came out with the Core 2 Duo, AMD has been unable to come up with an answer. Perhaps it had something to do with diversifying by buying up ATI, diverting capital or focus away from the core business. Ironically, the AMD/ATI brand of video cards has a better reputation than the AMD CPU division, though that's just my opinion...
Re: (Score:3)
We will continue to make chips for servers and low-end crap. We can't compete with Intel in the consumer market in the short to medium term, but we are still relevant in business circles.
Consumers, prepare to be gouged by Intel as soon as it figures this out.
Intel really can't gouge customers too hard, or it will hasten the transition away from x86 that they fear. ARM will be a much more serious competitor once Windows 8 is released with support for it. Yes, it requires everything to be recompile
Re: (Score:2)
Indeed. Intel's real competition hasn't been AMD for a few years now, it's been ARM.
Re: (Score:2)
Let Intel turn its full attention to ARM for a few cycles, and see if AMD doesn't punish them.
Re: (Score:2)
Let Intel turn its full attention to ARM for a few cycles, and see if AMD doesn't punish them.
Intel don't need to, because they're big enough to have different teams doing both. The problem is that no-one can really push the x86 architecture down to ARM-level power consumption because it's such a complex beast in comparison.
Re: (Score:2)
I wouldn't be surprised if Intel is REALLY regretting selling off XScale to Marvell - Intel had an ARM business for a while, but it just didn't do particularly well, so they sold it.
Probably 1-2 years later, the ARM market started exploding.
I would not be surprised if Intel is quietly working on getting back into the ARM business.
Because AMD still resembles a threat. (Score:2)
Intel is pushing forward because it's beneficial to them at the moment not to rest on their laurels.
AMD is underperforming, yes, but not so much that Intel is given any real leeway to slack off;
That is to say, if the i5/i7 lines were only a 5% increase over C2D performance for 1/3 higher price, AMD would have destroyed them, so while AMD hasn't been "real" competition for Intel for quite some time now, they've been good enough to keep the industry trudging along.
If AMD outright left the market, there would
Re: (Score:2)
Re: (Score:2)
This is pretty much what happened in the P4 days. Intel got complacent and started gouging customers, and that allowed AMD to gain HUGE amounts of market share.
Re: (Score:2, Informative)
Re: (Score:2)
Yeah, "gouge" might be too strong a word; elevated prices due to limited competition is likely a better way to put it. Even at its height, when enthusiasts argued hotly over which was better, AMD only held a marginal market share, mostly because the big-box vendors such as Dell, Gateway (remember them? whatever happened to them?), and the rest were reluctant to move away from Intel (I also recall some shady trade practices by Intel at the time).
In any case, even though limited, the competitio
Re: (Score:2)
ARM only will compete against Intel in cases where power consumption is more important than performance, i.e. netbooks and low power servers (read small market).
Or an increasing number of desktop/laptop uses. At least 90% of the time my laptop and desktop systems are clocked down to the minimum clock speed because there's really nothing for the CPU to do when browsing the web or writing documents.
An awful lot of current x86 CPUs could be replaced with ARM and users would barely notice.
Re: (Score:3)
They would notice once they tried to play any games or run any heavy apps like photoshop.
Which most people don't do. The most processor-intensive thing a large fraction of PC users do on a typical day is play HD video on Youtube.
Re: (Score:2)
ARM only will compete against Intel in cases where power consumption is more important than performance
And in places where ARM's performance is 'good enough'. I have a little machine with an 800MHz ARM Cortex A8. For light web browsing and word processing, it's just about good enough, but for anything heavier it isn't. I have another machine with a dual-core 1.5GHz Snapdragon (heavily tweaked A8), and it's fine for Flash-heavy web browsing and most other things including playing back streaming video. A quad-core 2GHz Cortex A9 is far more power than a large proportion of computer users currently need.
Radeon may save them... (Score:2)
My understanding is that Radeon cards are still competing neck-and-neck with Nvidia's offerings these days, especially per-dollar. I may be mistaken, though, as my video card is still an 18-month-old ATI Radeon 5850 (back before Nvidia even had a DirectX 11 card on the market, and before the AMD-ATI buyout), which can still play everything I've thrown at it on full settings at 1920x1080.
Even if their CPUs are lack-luster (even at the lower price point, it would seem, where they used to be quite competitive)
Re: (Score:2)
Re: (Score:3, Interesting)
Linux has had better support for ATI than Nvidia cards for at least a generation now.
Re:Radeon may save them... (Score:5, Insightful)
Re: (Score:3)
Add to that Nvidias clearly superior support for hardware accelerated HD decoding and really, my favorite card has some catching up to do. I spent months trying to get a Radeon card to work in my HTPC and I think I got the hardware decoding to
Re: (Score:2)
Re: (Score:2)
Yup, I have yet to ever see evidence that ATI has learned the concept of regression testing.
It seems like on a regular basis, Game X needs driver revision M or lower, and Game Y needs driver revision N or higher with ATI cards. So you're screwed if you want to play both games.
Every time I have had the misfortune of dealing with an ATI video chipset, it's been utter driver hell. NVidia does a much better job of regression testing, and they also do a MUCH better job of long term support of older chipsets.
Re: (Score:2)
Let us also not forget that AMD will not give you mobile drivers, which used to also be true of nVidia, but not any more. Now you can download the Quadro FX drivers (for example) direct from nVidia. But AMD still expects you to get mobile graphics drivers from the OEM. That's a convenient way to avoid supporting older graphics chips like the integrated graphics in my R690M-based "netbook" (at 11" with megachiclets it's more like a subnotebook) which only works properly in Windows Vista 32-bit. There's no Wi
Re: (Score:2)
Hmm. AMD has been trying to topple Intel by merging the CPU and the GPU into a single unit.
Intel tends to be better at single-threaded CPU performance, while AMD has been better at offering more cores. Where ATI changes the picture is graphics: Intel's graphics options are something of a terrible joke (played on corporate and value customers), while ATI's video cards are just as sought after as Nvidia's.
If AMD can offer a single chip that does both, and does it well (key factor here), with compilers that
Re:Radeon may save them... (Score:4, Insightful)
If AMD can offer a single chip that does both, and does it well (key factor here)
You can't put a 300W GPU and a 125W CPU on the same die. At least not if you're sane.
The only uses for graphics integrated on the CPU are cheap low-end systems, or extra performance if you can offload some processing to the GPU cores. Putting a high-performance GPU there makes no sense, because you need insane cooling to get the heat out and it will be crippled by the slow, shared memory interface anyway.
Plus, of course, you can't just upgrade the GPU in two years when the CPU is still fast enough for current games but the GPU isn't; you have to replace both. CPU manufacturers might love that, but users won't.
Re: (Score:3)
It doesn't help that Intel integrated graphics made GREAT leaps forward in Sandy Bridge.
Re: (Score:2)
The AMD buyout of ATI happened in 2006. Your 5850 was made by AMD.
One thing AMD has over Intel (Score:2)
I just upgraded my PC from an Intel E6600 to an AMD Phenom II X6 1100T. I chose AMD for one reason: how the heat sink/fan attaches to the motherboard.
I have dogs and kids, and my PC doesn't reside protected under a desk. It gets bumped all the time from them playing, and those stupid plastic plug brackets that Intel uses to attach the heat sink and fan to the motherboard were absolute garbage. Someone would bump my PC and the heat sink would hang off and cause the CPU to overheat. Not to mention after re
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
No, it hasn't. Intel heatsinks are attached by plastic clips that go through holes in the motherboard. AMD heat sinks are clipped to a bracket that is bolted through the motherboard to a backing plate behind the CPU socket. They use a cam-style locking mechanism to provide tension and keep the heatsink tight while eliminating the prying that was needed on socket 370 and socket A heatsinks.
Translation (Score:3)
When we contacted Silverman, he confirmed that the original statement has been taken somewhat out of context and provided additional clarification. 'AMD is a leader in x86 microprocessor design, and we remain committed to the x86 market. Our strategy is to buzzwords buzzwords buzzwords buzzwords buzzwords buzzwords.'
Dear AMD.... (Score:2)
I love you guys, but recently I've only been buying Intel i5 and i7 because your math coprocessor still stinks badly compared to Intel's. For video compression and really heavy maths, I really wanted to use your 6-core processors, but they were slower than the 4-core i7 I bought instead.
Give me a 6 core that runs like a raped ape and has a really good math coprocessor and I'll be back. give me an 8 core that can also do multi chip on the same motherboard so I can build a 16 core for a cheap price, and I'll
Re:Dear AMD.... (Score:5, Informative)
There are lots of us who actually do real computing with really heavy math.
So shouldn't you be using SSE instructions rather than x87?
Re: (Score:3)
Thank You.
And I think VirtualDub is compiled to take advantage of the AMD64 architecture.
Shame... (Score:4, Insightful)
Shame. I usually support the underdog, and always wanted AMD to be able to run neck-and-neck with Intel.
Nowadays, though, AMD seems to stand for A Mediocre Design.
I hope they can recapture their mojo and challenge Intel again, if for no other reason than to pressure Intel on pricing.
Re: (Score:2)
I think they're still going to be very popular for servers and supercomputers for a long time.
There's also the problem that (Score:2)
They don't beat the comparative Intel chips at said tasks anywhere near well enough to justify the heat and cost tradeoffs.
Re: (Score:2)
I support the underdog too!
I can remember the days when Intel had the whole market to itself; the 486 CPU was over $1,000, worth more than its weight in gold. Then Cyrix came along and started making a cheaper alternative, and the price soon dropped to less than $200 per CPU. The manufacturers were still making a profit, and consumers were better off.
As long as the performance is not too far off the mark I will continue to buy AMD.
Offer it to the Chinese (Score:2)
China needs a processor company, and even without AMD being leading-edge if their products are sold inexpensively there is a huge potential market worldwide.
Re: (Score:2)
Re: (Score:2)
The Bulldozer release showed AMD's commitment to low-end computers.
In what way? The Bulldozer architecture is transistor-heavy and uses lots of power, just the opposite of what you want in a low-end computer.
If anything shows AMD's commitment to low-end x86 computers, it's Bobcat.
Re:Am I the onlyone... (Score:5, Informative)
Re: (Score:2)
Re: (Score:2, Interesting)
He just read "The Innovator's Dilemma" by Clayton Christensen. He sees new innovation in x86 chipmaking as having diminishing returns, making the entire architecture susceptible to other architectures and competitors where new innovation still provides increasing returns.
Re: (Score:2)
Re: (Score:3)
Precisely. It meant "the point where AMD goes from a desktop chip maker that also makes mobile chips, to a mobile chip maker that also makes desktop chips".
Re:Am I the onlyone... (Score:5, Insightful)
that had to google "inflection point"? From a marketing standpoint it might be good to have a CEO who isn't an engineer :P.
or a CEO who picks up a word or phrase from an engineer and thinks, 'Hey, that sounds good, I'll use it in my next meeting or press statement!'
Re: (Score:3, Insightful)
Indeed. Which is why words and phrases like "pushing the envelope" and "quantum leap" are so often used wrong, and mark the CEO (who reflects on the company) as a dummy.
Re: (Score:2)
>are so often used wrong
Oh the irony. :)
(grammar: "... are so often used incorrectly, ...")
Re: (Score:2)
"Wrong" is an adverb [merriam-webster.com]. Coincidentally, M-W uses the word "incorrectly" as a synonym.
Re: (Score:2)
I know:
http://slashdot.org/comments.pl?sid=2543280&cid=38161632 [slashdot.org]
Re: (Score:3)
This is why it's good to have a background in math, even if you're not employed in a STEM field. All sorts of processes can be described in mathematical terms, and knowing what those terms mean helps you understand the world better. People often say, "Calculus? I'll never use that after high school!" But the truth is, I use my calculus education every single day without ever touching an integral or derivative.
Re: (Score:2)
This is why it's good to have a background in math, even if you're not employed in a STEM field. All sorts of processes can be described in mathematical terms, and knowing what those terms mean helps you understand the world better. People often say, "Calculus? I'll never use that after high school!" But the truth is, I use my calculus education every single day without ever touching an integral or derivative.
Why baffle people with BS when you can use real language :)
The new line will factor into integral processes where derivatives will end-product quality!
Sounds like something I'd read in a Dilbert strip...
Re: (Score:2)
Marketroids talk about graphs and curves all the time. They will have cost vs. performance, cost vs. time.
Imagine a basic bell curve of sales over time. You start at zero sales: the pre-release advertising campaign has run and the products have been reviewed before going on the shelves. Then they start to sell. The inflection point is where word of mouth has gone out about a product and demand is growing fastest.
After competitors start bringing out equivalent products, the demand starts to slow down. The
Re:Am I missing something here? (Score:5, Informative)
Re: (Score:2)
Every x86 CPU made today, whether from Intel, AMD, or even Via, supports the AMD64 extensions.
Some of the netbook Atoms didn't a year or two back; isn't that still the case today?
Re: (Score:2)
Some of the netbook Atoms didn't a year or two back; isn't that still the case today?
As far as I can tell, the N270 (Diamondville series) was the last Atom that didn't support 64-bit. A quick Google search indicates that Intel hasn't officially discontinued it, but it seems to be almost impossible to find any new products that contain one. Newegg doesn't have any netbooks using this, though they do sell a 10-pack of Intel Atom N270 motherboards [newegg.com]. Since they don't sell individual units, I assume these may b
Re: (Score:2)
Re: (Score:3)
x64 is an extension to x86. What we need is a whole new class of computers designed and built for 64-bit architectures. But that calls for a complete redesign of the most popular OS, and probably the MOBO architecture as well.
The problem is, who would want to do that?
What about ARM64? (Score:2)
It will be deployed in 2014:
http://www.eetimes.com/electronics-news/4230160/ARM-unveils-64-bit-architecture [eetimes.com]
"Indeed, the first processors based on ARMv8 will only be announced sometime in 2012, with actual server prototypes running on the new architecture expected in 2014."
Re: (Score:2)
Re: (Score:2)
We could just breathe life back into the Alpha architecture.
As a matter of fact, I believe that MS supported it right up until Windows 2000 (multiple RCs, no release).
I'd love to have an EV12 processor in my next machine. ^_^
Re: (Score:2)
x64 is an extension to x86. What we need is a whole new class of computers designed and built for 64-bit architectures.
How would "[designing] and [building]" a computer "for 64bit architectures" differ from what's being done now?
But that calls for a complete redesign of the most popular OS
I know of no desktop, notebook, or server OSes that would need "a complete redesign" to work on 64-bit architectures - they already work on them; presumably whatever would make the "whole new class of computers" different from the 64-bit computers being sold now is what would require that "complete redesign".
Re: (Score:2)
"x86" in this context means desktop x86 chips, x86_64 chips and AMD64 chips.
Re: (Score:2)
It sounds like everything you do is x86. The alternative to that architecture isn't x86-64, it's Itanium-64 or ARM or any of the various big iron RISC chips.
Re: (Score:2)
Re: (Score:2)
x86 is a subset of x86_64
No, it is not. The registers aren't even the same. You can run 32-bit code on it by flipping into the right mode, but that doesn't make it a subset of 64-bit mode.
Re:x86 (Score:5, Insightful)
When the Pentium 4 came out, it was frequently called the "7th generation", but it was never called the 786 or 80786, either formally or informally
But they are all x86 compatible, because they can all run code compiled for 8086, 80186, 80286, 80386 and 486 processors.
My new hobby will be referring to processors as having x87 architecture, as a distinction to indicate they support floating point instructions.
People do refer to x87 when talking about the FPU on x86 chips. It's commonly used when differentiating it from SSE - modern compilers will emit SSE instructions instead of x87 ones unless you specify a backwards compatible target architecture (PII or earlier).
Re: (Score:2)
Because the processor (theoretically) supports the x86 instruction set?
Re: (Score:2)
I thought the last x86 processor produced was the Pentium Pro
Assuming I understand your pseudo-purist definition of "x86" correctly, one minor pedantry...
The Pentium Pro wasn't the *last* of Intel's "real" x86 processors, it was the *first* of the RISC-with-x86-wrapper (*) designs that make up all chips today. AFAIK the original Pentium line was the last.
(*) Some have claimed that the core isn't actually that RISC-like; the point here is that it's not native x86.
Re: (Score:2)
The Pentium Pro wasn't the *last* of Intel's "real" x86 processors, it was the *first* of the RISC-with-x86-wrapper (*) designs that make up all chips today. AFAIK the original Pentium line was the last.
The poster to whom you're responding said
in which case the last "x86 processor" was the 80486 - the Pentium wasn't sold as the 80586, it was sold as the Pentium.
And, given that the internal microops in Pentium Pro and later are not exposed to prog
Re: (Score:2)
Re: (Score:2)
AMD's been more or less out before. I've got an AMD chip in my laptop and it's really nice. I get good battery life and the performance is good. Plus, I was able to buy the entire thing for a fraction of what an Intel rig of similar power would cost.
That being said, it doesn't look good, and they're going to have to kick R&D up a few notches if they want to earn business beyond the not-Intel crowd.