Intel and AMD Form an x86 Ecosystem Advisory Group (phoronix.com) 55

Phoronix's Michael Larabel reports: Intel and AMD have jointly announced the creation of an x86 ecosystem advisory group to bring together the two companies as well as other industry leaders -- both companies and individuals such as Linux creator Linus Torvalds. Intel and AMD are forming this x86 ecosystem advisory group to help foster collaboration and innovations around the x86 (x86_64) ISA. [...] Besides Intel and AMD, other founding members include Broadcom, Dell, Google, HPE, HP Inc, Lenovo, Microsoft, Oracle, and Red Hat. Here are the "intended outcomes" for the group, as stated in the press release:
- Enhancing customer choice and compatibility across hardware and software, while accelerating their ability to benefit from new, cutting-edge features.
- Simplifying architectural guidelines to enhance software consistency and standardize interfaces across x86 product offerings from Intel and AMD.
- Enabling greater and more efficient integration of new capabilities into operating systems, frameworks and applications.

This discussion has been archived. No new comments can be posted.

  • Really? (Score:4, Funny)

    by sconeu ( 64226 ) on Tuesday October 15, 2024 @05:13PM (#64867207) Homepage Journal

    One of the rationales is:

    Enhancing customer choice and compatibility across hardware and software

    If this is the case, then why are Microsoft and Oracle involved?

    • by dfghjk ( 711126 )

      and why Linus? What value, beyond political, does he provide?

      • by gweihir ( 88907 )

        Who says this whole thing is not political?

      • Re:Really? (Score:5, Informative)

        by AvitarX ( 172628 ) <me@@@brandywinehundred...org> on Tuesday October 15, 2024 @05:42PM (#64867303) Journal

        I imagine he has some insight into what features would be beneficial to things running well and which would be a waste of chip space.

        He's only one perspective, but definitely one I'd want when planning out new features for a next-generation chip.

        • by AmiMoJo ( 196126 )

          They all seem to have bet on AI tech being the next big thing, but it won't be. I wonder how much die space is wasted on AI accelerators that very few people care about.

          • They all seem to have bet on AI tech being the next big thing,

            In a weirdly half-arsed way. Somehow even getting decent inference on their GPUs (integrated or discrete) is a huge pain, especially if you want better than a completely generic third-party Vulkan backend that hasn't even been looked at by someone who owns the same GPU as you.

            NVidia has bet on it big time. Intel and especially AMD appear to be maximizing the spend while minimizing the effect.

            • by AmiMoJo ( 196126 )

              I read a tweet just now from a woman who signed up to speak at a conference. The organizer took her profile photo that was cut off at chest height and used AI to make it a bit taller. The AI hallucinated a visible bra and plunging neckline for her.

              It's way too early to be deploying this crap for anything that matters.

              • brilliant!

                There are plenty of good uses for it, if one rather unfashionably calls it ML or, even worse, machine vision. AMD still suck, and so do Intel.

      • This is a wild ass guess: MS/Linus/etc are involved to get the hardware and software people talking more amongst each other.

        The Win 11 24H2 update affecting Ryzen CPU performance, and it being released after the launch of the first Ryzen 9000 CPUs, is all the evidence required that there is real value in discussion between parties in the x86 space (both hardware and software).

      • Re:Really? (Score:4, Informative)

        by MachineShedFred ( 621896 ) on Tuesday October 15, 2024 @06:24PM (#64867395) Journal

        Political value is still value. And if he wasn't involved and it was just Microsoft, Oracle, and Red Hat everyone would be screaming conspiracy theories.

        At least we have a reasonable chance of him calling others on their bullshit, and we know he won't just shut up and take it if they're trying some shit. He has no problem with blowing whistles on bullshit (unless it's his own flavor of bullshit, but that's what the other guys are for - to call him on his too).

      • Re:Really? (Score:5, Insightful)

        by arglebargle_xiv ( 2212710 ) on Tuesday October 15, 2024 @06:44PM (#64867457)
        I think that's it, it's a political move. Can you imagine Intel, before things went downhill for them, ever deciding to consult with others before they did something?
        • by tlhIngan ( 30335 )

          I think that's it, it's a political move. Can you imagine Intel, before things went downhill for them, ever deciding to consult with others before they did something?

          There is truth to this. In the days before we all used NT based Windows, the old Windows 3.x used illegal instruction errors to get back into kernel mode, as it was generally one of the fastest exceptions at the time.

          So when Intel was describing their new chips, they asked Microsoft engineers what their desires were, and one asked for even fast

      • To avoid him ranting on the power virus [zdnet.com] of AVX-512. /s

      • Re:Really? (Score:5, Informative)

        by TheRealMindChild ( 743925 ) on Tuesday October 15, 2024 @08:37PM (#64867713) Homepage Journal

        He wrote the entire firmware, the "Code Morphing Software", for the Transmeta Crusoe. He knows more about the x86 instruction set and its operation than most people on earth.

    • Re: (Score:2, Troll)

      by gweihir ( 88907 )

      Because it is a lie.

    • by Kwirl ( 877607 )
      This is pretty obviously an attempt to stop Apple from just slapping their silicon into everything and making it the 'standard'.
      • Apple? You're kidding right? They won't ship their chips in any machine but their own. ARM on the other hand...

        • by Creepy ( 93888 )

          Apple runs Acorn RISC Machine CPUs: ARM. It was once ABC - Acorn Business Computers; now it is Advanced RISC Machines. Apple has completely adopted ARM, as its licensing fees are easy to absorb, even easier than USB's.

  • by Alworx ( 885008 ) on Tuesday October 15, 2024 @05:13PM (#64867209) Homepage

    - weaken ARM

    • Or just survive against ARM. The vast, vast majority of users want laptops, and they don't play games on their computers, at least nothing more complicated than a crossword puzzle.

      And ARM-based laptops are still kicking Intel's ass on battery life. Once you get to 12 hours there's no practical reason for more battery life for most users, but a lot of users like only having to plug their laptop in occasionally. OLED screens guzzle battery life, so having a more efficient chipset offsets that and lets you
      • Actually Lunar Lake is pretty good on battery life. X Elite isn't really any better.

      • The vast vast majority of users want laptops and they don't play games on their computers, at least nothing more complicated than a crossword puzzle.

        [citation needed] (sure they are a *vast* majority?)

        Maybe the users you speak of would be better served by tablets?

      • arm based laptops are still kicking Intel's ass on battery life

        So? They aren't kicking AMD's. Intel is well known to be far behind on bringing down TDP for several generations now.

    • by dfghjk ( 711126 )

      it's a conspiracy to compete!

  • The basic x86 instruction set is 46 years old. 8086 was released in 1978.
    • by Cassini2 ( 956052 ) on Tuesday October 15, 2024 @05:37PM (#64867281)

      Many of the instructions were clones of the earlier 8080A [wikipedia.org], and that was one of the decisive reasons for the platform's success too. Specifically, DOS and BASIC predated the x86 as Z80/8080A programs. Other than some strange stuff with the CP and A registers having reversed byte orders, it was really easy to take 8080A code and make it into x86 code. (I made an emulator to do this.)

      The roots of software compatibility run deep. The first successful microprocessor defined the major instruction set for the next 50 years, with many additions along the way.
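The 8080A-to-8086 portability the parent describes came from a near one-to-one register mapping. A minimal sketch of that mapping, following the register correspondences in Intel's published 8080-to-8086 translation scheme (the `translate_mov` helper here is a toy illustration, not a real translator, which also had to handle flags, addressing modes, and the byte-order quirks mentioned above):

```python
# 8080 -> 8086 register correspondences used by source-level translators:
# the 8080 pairs BC, DE, HL map onto CX, DX, BX, and A maps to AL.
REG_8080_TO_8086 = {
    "A": "AL", "B": "CH", "C": "CL",
    "D": "DH", "E": "DL", "H": "BH", "L": "BL",
    "SP": "SP", "M": "[BX]",  # M = memory addressed by HL, hence [BX]
}

def translate_mov(line: str) -> str:
    """Translate a simple 8080 'MOV dst,src' into its 8086 counterpart."""
    op, args = line.split(maxsplit=1)
    dst, src = (a.strip() for a in args.split(","))
    return f"{op} {REG_8080_TO_8086[dst]},{REG_8080_TO_8086[src]}"

print(translate_mov("MOV A,B"))   # -> MOV AL,CH
print(translate_mov("MOV E,M"))   # -> MOV DL,[BX]
```

Because almost every 8080 register and instruction had a direct 8086 analogue, such translation could be largely mechanical.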

      • by Cassini2 ( 956052 ) on Tuesday October 15, 2024 @05:44PM (#64867313)

        Correction: The flags and A registers have reversed byte orders.

      Thanks, yes I’m aware of most of that, but it's always interesting for others to hear. My first computer had a Z80 — in the Sinclair ZX81 — and I did assembly language dev on that and the 6502 (C64) and later on Amiga and Macintosh. The instruction set and large orthogonal registers of the 68000 were brilliant compared to x86 (even now!). I selected the 8086 as a reasonable baseline for the true “start” of x86. I thought if I chose the 8080, or even the 8008 or 4004, someone would say that is not x86!
      • DOS and BASIC as names are both things of the 1960s. 8080/Z80 had them, but they're not the same programs as their 8086 counterparts. Okay, MS Basic for 8080 resembled that for 8086, but so did MS Basic for 6502.

        Incidentally, there was an MS DOS for the Z80 (sorta), created by the author of MS-DOS. It was a port of MS-DOS, released much later, and called MSX-DOS.

      • by havana9 ( 101033 )
        The success was helped a lot by IBM choosing it to build the first PC.
        If the first IBM PC had used a Z8000 or a Motorola 68000, the history of microprocessors would have been decidedly different.
        Intel didn't like that architecture very much; they made the 8086 as a stopgap CPU while developing the iAPX 432. They tried many times to propose new CPUs, ending with the Itanium, and while at it killed off the HP and DEC processor lines.
    • by dfghjk ( 711126 )

      the basic instruction set that's not used any more? How old is 64 bit x86?

      Also, ARM is 45 years old.

      • Even when executing on latest hardware and compiled with latest compilers in 64-bit, many original 8086 instructions are used, often in their original short form and addressing modes, with identical opcode encoding and behaviour. If you attach a debugger to a random process on a modern x86 machine, and examine the instructions executing on a random thread, there is a good chance that many will appear in the original 8086 manual.
      • Not used anymore? You may still use it if you want. Modern x86 CPUs still support both real mode and 16-bit addressing, and using both means using the original 8086 instruction set.
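As an illustration of how far the original encodings carry: several one-byte opcodes mean exactly the same thing on a current x86-64 CPU as in the 1978 8086 manual. A minimal sketch (illustrative subset only; `disasm_byte` is a hypothetical toy, not a disassembler):

```python
# One-byte opcodes whose encodings are unchanged from the 1978 8086
# manual through today's x86-64 (a small illustrative subset).
OPCODES_1978 = {
    0x90: "NOP",    # encoded as XCHG AX,AX on the 8086
    0xC3: "RET",    # near return
    0xF4: "HLT",
    0xCC: "INT3",   # one-byte breakpoint
    0xFA: "CLI",
    0xFB: "STI",
}

def disasm_byte(b: int) -> str:
    """Look up a single opcode byte; '??' for anything outside the subset."""
    return OPCODES_1978.get(b, "??")

# The byte sequence 90 C3 still means NOP; RET on a current CPU.
print([disasm_byte(b) for b in bytes([0x90, 0xC3])])  # -> ['NOP', 'RET']
```

Attaching a real disassembler to a modern binary shows the same effect at scale: plenty of emitted instructions decode exactly as the 8086 manual specified.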

    • by erice ( 13380 ) on Tuesday October 15, 2024 @05:42PM (#64867301) Homepage

      It took so long for Intel to weaken to the point where it needed to work with others (especially AMD) rather than just forcing everyone else to follow their lead. The AMD64 debacle did not cut deeply enough.

      • It took so long for Intel to weaken to the point where it needed to work with others (especially AMD) rather than just forcing everyone else to follow their lead.

        Except they didn't have to do that, and you alluded to it:

        The AMD64 debacle did not cut deeply enough.

        You mean the debacle in which they had to follow AMD's lead?

        The only time AMD tried to differ and failed was FP on the K6, and we had to have patches for it, and then they fixed it in the K6/2.

    • The basic x86 instruction set is 46 years old. 8086 was released in 1978.

      And it has far outlived its usefulness. How much farther ahead could we have been in computer technology if we'd standardized on a better architecture?

  • by MachineShedFred ( 621896 ) on Tuesday October 15, 2024 @06:17PM (#64867371) Journal

    Would have been nice to have at least some lip service towards secure chip design that doesn't require disabling performance features you paid for just to avoid leaking data.

    • by VaccinesCauseAdults ( 7114361 ) on Tuesday October 15, 2024 @07:14PM (#64867529)
      Off topic, but regarding your profile quote. They seem to have finally fixed the issue in the last few months. Quotes and apostrophes finally work — here’s an “example” with em-dashes too. This is written directly on an iPhone.
      • Off topic, but regarding your profile quote. They seem to have finally fixed the issue in the last few months

        It must be the last two weeks or so, because less than that long ago I pasted some text from a webpage that got mangled.

      • by reanjr ( 588767 )

        Good job, Slashdot. It only took about 3 decades to figure out how encodings work. Kudos.

    • As someone earlier pointed out, this is to weaken ARM. While the marketing terms imply this alliance will be 'customer' focused, it will be nothing of the sort.

  • For Intel and AMD to join together, they must really be scared of ARM or even RISC V (even though RISC V sucks, it has potential).

    • by Misagon ( 1135 )

      There are at least half-a-dozen RISC-V cores out there at fabless semiconductor companies that (on paper, at least) should be able to compete pretty well with x86, if only they found their way into actual silicon...
      "if only"... because that latter part is the harder part.

  • RISC-V is on a trajectory to eat everybody's lunch.

    If not for performance, then for security, then for cost.

    Since the x86 ISA sits on top of RISC-like micro-ops these days, I wonder what would happen if somebody tried the same approach on the RISC-V architecture.

    $99 laptops probably.

    • Oh! Now there's an idea! Somebody should make a RISC-V extension that adds hardware acceleration for x86 emulation.
