AMD Hardware

64-bit x86 Computing Reaches 10th Anniversary

Posted by Unknown Lamer
from the long-live-athlonmp dept.
illiteratehack writes "Ten years ago AMD released its first Opteron, the first 64-bit x86 processor. The firm's 64-bit 'extensions' allowed the chip to run existing 32-bit x86 code, in a bid to avoid the compatibility problems faced by Intel's Itanium processor. However, AMD suffered from a lack of native 64-bit software support, with the shortcomings of Microsoft's Windows XP 64-bit Edition severely hampering its adoption in the workstation market." But it worked out in the end.


Comments Filter:
  • by Grashnak (1003791) on Monday April 22, 2013 @06:56PM (#43520313)

    My 32 GB of RAM, absolutely essential for my work, laughs at your "memory management" bullshit.

  • by LocalH (28506) on Monday April 22, 2013 @06:59PM (#43520341) Homepage

    Those were x86-based? The title was "64-bit x86 Computing Reaches 10th Anniversary", not "64-bit Computing Reaches 10th Anniversary".

  • "worked out" (Score:2, Insightful)

    by girlintraining (1395911) on Monday April 22, 2013 @07:01PM (#43520353)

    But it worked out in the end.

    Yes, mostly due to the fact that we needed a way to get past the 4GB memory limitation, and not because we gave a damn about whether the processor was native x64 or not. AMD has had some great ideas, but they've almost always shorted themselves on the implementation, leaving the field wide open for Intel to come in with a better offering and take the lion's share of the profit.
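    The 4GB ceiling mentioned above falls straight out of pointer width. A minimal sketch in plain Python arithmetic (the 48-bit virtual-address figure for early x86-64 parts is the only hardware fact assumed):

    ```python
    # A pointer of N bits can name 2**N distinct byte addresses.
    def addressable_bytes(pointer_bits: int) -> int:
        return 2 ** pointer_bits

    GiB = 2 ** 30
    TiB = 2 ** 40

    print(addressable_bytes(32) // GiB)  # 4  -> the classic 4 GiB limit
    # Early x86-64 chips exposed 48 virtual address bits, not all 64:
    print(addressable_bytes(48) // TiB)  # 256 TiB of virtual address space
    ```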

  • When AMD gave a presentation to my processor design course (not coincidentally, about 10 years ago), one of the presenters said that one of the most surprising speed-ups for 64-bit code came simply from having 16 real general-purpose registers to work with. Even though register renaming lets the hardware smooth over a small architectural register file, it meant all those extra load and store ops (which renaming would identify as waste and work around) now didn't need to be in the code at all. The difference turned out to be non-trivial for one of their test apps.

    So those 32 extra bits of memory addressing are nice. But don't forget about that 1 extra bit for identifying registers!
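    That "1 extra bit" is literal: the REX prefix widens the instruction's register-select field from 3 to 4 bits, doubling the architectural GPR count. A toy illustration (Python; the REX encoding detail is the only hardware fact assumed):

    ```python
    # Classic x86 encodes a register in a 3-bit field; x86-64's REX
    # prefix contributes a 4th bit, doubling the registers you can name.
    def register_count(field_bits: int) -> int:
        return 2 ** field_bits

    print(register_count(3))  # 8  -> eax..edi on 32-bit x86
    print(register_count(4))  # 16 -> rax..r15 on x86-64
    ```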

  • Re:"worked out" (Score:5, Insightful)

    by Dawn Keyhotie (3145) on Monday April 22, 2013 @11:11PM (#43521817)

    WRONG on many levels. Yes, we had to get past the 4GB memory limitation, but there had been, and still were at the time, several other true 64-bit microprocessors around when AMD introduced the Opteron: Alpha, UltraSPARC, MIPS, PowerPC, and yes, even IA-64 (not to mention IBM POWER and zSeries). But they all had the fatal flaw of NOT being compatible with Intel's 32-bit x86 processors and off-the-shelf Windows software. Only Opteron had that, and that compatibility was so critical that Intel was grudgingly forced to adopt the x86-64 instruction set.

    So, you may ask, why didn't AMD take the IT world by storm? Because: 1) AMD was not Intel, and never could/would be; 2) Intel was paying manufacturers NOT to offer ANY AMD based systems with marketing kickback agreements; 3) Intel would punish any manufacturer who did offer AMD systems with exorbitant price hikes on the Intel parts they did sell; 4) All this was taking place during the Bush years of federal laissez-faire non-enforcement policy, giving Intel free rein on those practices; 5) Prejudice against AMD in the IT industry was widespread, and still is; 6) few people saw or acknowledged the need for a flat 64-bit address space; 7) those that did have the need for 64-bit software were forced to spend exorbitant amounts of money for RISC workstations, which motivated them to look down their noses at commodity PCs, even if they were 64-bit; 8) Chicken-and-egg syndrome (no volume 64-bit hardware, thus no volume 64-bit software, thus no need for volume 64-bit hardware).

    So AMD did not "short themselves on implementation". Their architecture was state of the art, and kicked both the 32-bit Pentium and the non-compatible IA-64 in the nuts. They had all of today's advanced hardware features years before Intel: the x86-64 architecture; HyperTransport, to replace the front-side-bus bottleneck and enable point-to-point CPU links; and on-board memory controllers. AMD was not able to block Intel from poaching their features because of the pre-existing patent cross-licensing agreements. And anti-monopoly enforcement was practically non-existent at the time (and not much better today).

    Of course, none of this is meant to imply that AMD was not partially or even mostly responsible for their troubles. They were (and still are) horrible at executing their own roadmaps. They were (and still are) horrible at marketing to consumers. They were (and still are) horrible at manufacturer relations. They were (and still are) unable to make a sane strategic decision if their life depended on it. They were (and still are) perceived as the el-cheapo Intel-knockoff copycat instead of pioneering leaders in their field.

    So yeah, AMD is a hot mess, but there is plenty of blame to go around.

  • by tlhIngan (30335) <slashdotNO@SPAMworf.net> on Tuesday April 23, 2013 @12:49AM (#43522219)

    AMD may have helped create the x86-64 market, but now it's getting killed by it. soon Intel will be the only major player. ARM market is AMD's only hope.

    Intel won't let AMD die. In fact, AMD is right where Intel wants them to be - big enough to ward off government regulators, small enough to not be a huge pain in the rear. Intel and other large companies are scared of government regulation and monopoly declaration, and we do know that Intel has committed enough sins that if the regulators look hard enough, they can make a case to break up Intel. Including separating the ASIC design and foundry parts (and we know Intel has a LOT of foundry capacity). And I'm sure Intel's shareholders would rather give up some revenue to ward off the much bigger hit that would happen when the government regulators step in.

    It's entirely possible that Intel has a bunch of "AMD rescue" plans - ranging from simple "let's just buy up all of AMD's CPUs and bury them" to more elaborate schemes. Of course, Intel cannot directly fund AMD. Perhaps Intel could give AMD some patents in an emergency.

    Heck, you could argue that Intel told Sony and Microsoft to buy AMD chips - it gives AMD a nice steady income for the next few years. Intel could've used their extensive fab capacity to make custom chips for the consoles (much more easily than AMD can), but you can bet an opportunity like this to help prevent AMD from keeling over was just perfect.

    And no, this isn't unusual in the business world. What you see as competitors can have all sorts of incestuous relationships amongst themselves - it's not unknown for competitors to buy parts from each other. And you can bet Apple, Google, Microsoft, Samsung and others are far more chummy with each other than patent lawsuits or settlements imply. There are enough back-room deals and arrangements that really hide the interdependence they all have on each other.

  • by zbobet2012 (1025836) on Tuesday April 23, 2013 @03:16AM (#43522675)
    It sounds like you were just talking to a very bad functional programmer. You also have the order completely backwards. ANSI Common Lisp was the first standardized OO language. But more importantly, most "OO" concepts come from functional languages to start with.

    Design patterns are, for the most part, adaptations of pre-existing functional concepts. For example, Chain of Responsibility is really just a slightly simplified monad (input must equal output). The first Iterator pattern was (map fn list). Flyweight is a simplified form of memoization.
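    Two of those correspondences can be sketched in a few lines of Python (names here are illustrative, not from any particular pattern catalog):

    ```python
    from functools import lru_cache

    # Iterator pattern as (map fn list): map walks the sequence and
    # applies the function, exactly what a hand-rolled iterator would do.
    squares = list(map(lambda x: x * x, [1, 2, 3, 4]))
    print(squares)  # [1, 4, 9, 16]

    # Flyweight as memoization: build each distinct object once, then
    # hand back the shared cached instance on every later request.
    @lru_cache(maxsize=None)
    def glyph(char: str) -> tuple:
        return (char, "rendered")  # stand-in for an expensive shared object

    assert glyph("a") is glyph("a")  # same cached instance, Flyweight-style
    ```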

    Packages and namespaces also appeared in many functional languages first. Encapsulation via lexical closures has been around since Scheme was invented in the '70s. Lambda functions? Those little gems, making their way into every OOP language, were invented in Lisp.

    You have missed the entire point, though, if you think OOP is about organizing your programs or something. OOP is largely about encapsulating moving parts into logical pieces. Functional code is largely about minimizing or removing "state" (aka moving parts) from your code. E.g., an input to a function should always give the same output. These concepts are not incompatible at all.
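    The "same input, same output" point in a minimal Python sketch (the functions are contrived, purely to show state leaking into a result):

    ```python
    # Pure: the output depends only on the input, so calls are substitutable.
    def double(x: int) -> int:
        return 2 * x

    # Impure: a hidden "moving part" (the counter) leaks into the result,
    # so identical calls disagree.
    _calls = 0
    def double_and_count(x: int) -> int:
        global _calls
        _calls += 1
        return 2 * x + _calls

    assert double(21) == double(21)                      # always 42
    assert double_and_count(21) != double_and_count(21)  # state leaks in
    ```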
