Hardware Technology

VMware, AMD, Samsung and RISC-V Push For Confidential Computing Standards (theregister.com)

VMware has joined AMD, Samsung, and members of the RISC-V community to work on an open and cross-platform framework for the development and operation of applications using confidential computing hardware. The Register reports: Revealing the effort at the Confidential Computing Summit 2023 in San Francisco, the companies say they aim to bring about an industry transition to practical confidential computing by developing the open source Certifier Framework for Confidential Computing project. Among other goals, the project aims to standardize on a set of platform-independent developer APIs that can be used to develop or adapt application code to run in a confidential computing environment, with a Certifier Service overseeing them in operation. VMware claims to have researched, developed and open sourced the Certifier Framework, but with AMD on board, plus Samsung (which develops its own smartphone chips), the group has the x86 and Arm worlds covered. Also on board is the Keystone project, which is developing an enclave framework to support confidential computing on RISC-V processors.
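
The Certifier Framework's exact interface is beyond the scope of this summary, but the intended flow is easy to sketch. The Python below is a hypothetical illustration only: every name in it (platform_get_attestation, CertifierClient, request_admission) is a placeholder, not the framework's real API. The shape of the flow follows the announcement: a workload obtains a hardware attestation report, a Certifier Service checks it against policy, and the admitted workload receives a credential it can use with peers on any supported CPU.

    # Hypothetical sketch only: all names below are illustrative placeholders,
    # not the real Certifier Framework API.

    def platform_get_attestation() -> bytes:
        # Stand-in for asking the hardware (SEV-SNP, TDX, Arm CCA, Keystone, ...)
        # for an attestation report over the workload's measurement.
        return b"fake-attestation-report"

    class CertifierClient:
        """Client for a (hypothetical) Certifier Service that admits workloads."""
        def __init__(self, service_url: str):
            self.service_url = service_url

        def request_admission(self, attestation: bytes, policy_id: str) -> bytes:
            # A real service would verify the report against policy and return
            # a certificate; this sketch just fabricates one.
            return b"certificate-for-" + policy_id.encode()

    def run_confidential_workload(policy_id: str) -> None:
        attestation = platform_get_attestation()
        cert = CertifierClient("https://certifier.example").request_admission(
            attestation, policy_id)
        # The certificate would then authenticate secure channels between
        # admitted workloads, regardless of which vendor's TEE each runs in.
        print("admitted with certificate:", cert)

    run_confidential_workload("example-analytics-policy")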

Confidential computing is designed to protect applications and their data from theft or tampering by protecting them inside a secure enclave, or trusted execution environment (TEE). This uses hardware-based security mechanisms to prevent access from everything outside the enclave, including the host operating system and any other application code. Such security protections are likely to be increasingly important in the context of applications running in multi-cloud environments, VMware reckons.
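
Whether a given Linux machine exposes any of this hardware can be probed roughly as below. The sysfs and device paths are assumptions that vary by kernel version, drivers, and whether the system is a host or a guest, so treat this as a heuristic sketch rather than an authoritative check.

    from pathlib import Path

    # Each path is an assumption; absence here proves nothing about the CPU.
    probes = {
        "AMD SEV host support (kvm_amd)": "/sys/module/kvm_amd/parameters/sev",
        "AMD SEV-SNP guest device":       "/dev/sev-guest",
        "Intel TDX guest device":         "/dev/tdx_guest",
        "Intel SGX enclave device":       "/dev/sgx_enclave",
    }

    for name, path in probes.items():
        status = "present" if Path(path).exists() else "not found"
        print(f"{name:32s} {path:40s} {status}")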

Another scenario for confidential computing, put forward by Microsoft (which believes confidential computing will become the norm), is multi-party computation and analytics. This sees several users each contribute their own private data to an enclave, where it can be analyzed securely to produce results much richer than each would have got purely from their own data set. This is described as an emerging class of machine learning and "data economy" workloads that are based on sensitive data and models aggregated from multiple sources, and which will be enabled by confidential computing. However, VMware points out that, like many useful hardware features, it will not be widely adopted until it becomes easier to develop applications in the new paradigm.
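
The multi-party scenario is easiest to see with a toy model. The sketch below uses plain Python as a stand-in for an enclave; there is no real isolation or encryption here, only the shape of the idea: each party submits private values, and only the aggregate statistic ever leaves.

    # Toy model of multi-party analytics: plain Python standing in for a TEE.
    from statistics import mean

    class ToyEnclave:
        def __init__(self):
            self._private_inputs = []      # never exposed outside the class

        def submit(self, party: str, values: list[float]) -> None:
            # In a real TEE each party would encrypt its data to the enclave
            # after verifying an attestation report; here we just store it.
            self._private_inputs.append((party, values))

        def aggregate_mean(self) -> float:
            # Only this derived result crosses the "enclave" boundary.
            all_values = [v for _, vals in self._private_inputs for v in vals]
            return mean(all_values)

    enclave = ToyEnclave()
    enclave.submit("hospital_a", [4.1, 5.0, 6.2])
    enclave.submit("hospital_b", [5.5, 4.8])
    enclave.submit("hospital_c", [6.0, 5.9, 5.1, 4.7])
    print("joint mean, richer than any single data set:",
          round(enclave.aggregate_mean(), 2))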


Comments:
  • by CrappySnackPlane ( 7852536 ) on Friday June 30, 2023 @06:49PM (#63647392)

    Let's face it, it's a real pain in the ass having to rely on exploits to make sure your crapware isn't immediately visible on Task Manager. What a truly considerate development for all those industrious and dutiful black hats out there!

    • by AmiMoJo ( 196126 )

      If you read the spec, it's obvious that this would make it harder to hide malware. To access it, the malware would need to be signed with a key that at least the OS trusts, and then make use of a special framework that would make it stand out like a sore thumb.

  • I'm sure this will work out swell for everyone else.

    • Confidential computing is designed to protect applications and their data from theft or tampering by protecting them inside a secure enclave, or trusted execution environment (TEE)

      Today I made an appearance downtown
      I am a trusted environment because I say I am
      And I said gentlemen, and I use that word loosely
      I will compute for you, I'm a gun for hire, I'm a saint, I'm a liar
      Because there are no facts, there is no truth
      Just data to be manipulated
      I can get you any result you like
      What's it worth to you?
      Because there is no wrong, there is no right
      And I sleep very well at night
      No shame, no solution, no remorse, no retribution
      Just vendors selling products
      Just opportunity to part

  • Part of this sounded like the encryption that still lets you compute on hidden data (fully homomorphic encryption?). Think IBM worked on one. But it was slow (took a lot of extra work).

    Is this the new in-CPU toolbox for Blu-Ray stuff on Windows 12? I know the last Intel "trusted execution" thing is dead. And hasn't been included for a few generations. But I don't think AMD has copied it to include. And Intel didn't replace it (last I searched).

    • No, it's not homomorphic encryption.
      My understanding is that this will be a mechanism for the CPU to generate (or load) a symmetric key to transparently encrypt/decrypt a memory region and data written to storage. I imagine this symmetric key can only leave the CPU in encrypted form (wrapped with an asymmetric algorithm), to be stored on disk.
      You will probably be able to init a secure enclave by providing your own symmetric key encrypted with the public key of the CPU (roughly sketched below).
      It will be interesting to see how this will affect v
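
      A minimal sketch of the key-wrapping flow described above, assuming the third-party Python "cryptography" package and using a locally generated RSA key to stand in for the CPU's built-in key; this illustrates the idea, not any vendor's actual mechanism.

        # Requires: pip install cryptography
        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        # RSA keypair standing in for the CPU's built-in asymmetric key.
        cpu_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        cpu_public = cpu_private.public_key()

        # Per-region symmetric key, used to encrypt one "memory page".
        page_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        page_plaintext = b"contents of one protected memory region"
        page_ciphertext = AESGCM(page_key).encrypt(nonce, page_plaintext, None)

        # The symmetric key only leaves the "CPU" wrapped under the asymmetric
        # key, e.g. so it can be stored on disk.
        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)
        wrapped_page_key = cpu_public.encrypt(page_key, oaep)

        # Later, the "CPU" unwraps the key and decrypts the region.
        unwrapped = cpu_private.decrypt(wrapped_page_key, oaep)
        assert AESGCM(unwrapped).decrypt(nonce, page_ciphertext, None) == page_plaintext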
  • ... support confidential computing ...

    Knowing nothing about this, I can only assume the idea is, files decrypt to CPU cache, not RAM. This requires every process to have two decryption keys: one for read-only data (e.g. software and DLL thunks) and one for user data (to encrypt/decrypt data sent to/from the operating system).

    Given the current assault on privacy, operating systems will need a kiddie-porn scanner, like iPhone planned to have. Or, software developers will have to submit their decryption keys to the government so it can decrypt

    • Re:Can only assume (Score:4, Interesting)

      by codebase7 ( 9682010 ) on Saturday July 01, 2023 @03:57AM (#63648094)

      Knowing nothing about this, I can only assume the idea is, files decrypt to CPU cache, not RAM.

      That's a TPM [wikipedia.org], not a CPU. The basic idea is Clipper Chip 3.0, but managed by some private entity instead of the government, so there's even less oversight and a greater chance of compromise.

      Sure, it allows user-submitted keys, but the encryption algorithm is locked behind an encrypted, cryptographically signed binary blob that is by design not user-updatable, with a unique per-chip master key assigned at the factory and no owner override facility. (Because the private management entity is the owner of the chip / computer / device, not you.) DRM gets enforced via mandatory hypervisors, enforced use of the unique master key as the root of all trust, and memory curtaining. (A process not on the hypervisor's whitelist, even the OS, cannot access the contents of curtained memory.) Go nuts, malware authors, you're welcome!

      Given the current assault on privacy, operating systems will need a kiddie-porn scanner, like iPhone planned to have.

      Not needed. The government just needs to provide the FBI backdoor UEFI app and have the TPM require it to be running in hypervisor protected curtained memory before booting anything else. (With automatic shutdown if it doesn't respond to requests, just like systems today do if they cannot find / load the Intel ME / AMD PSP / ARM TrustZone binary blob(s).)

      • by AmiMoJo ( 196126 )

        That's not how a TPM works. There is no factory-assigned key; keys are generated on request. The secured data that is stored is combined with a hash of the entire boot chain (see the sketch below), which is why, when you update your BIOS, the data is lost and you have to enter a recovery key. The same thing will happen if your OS bootloader is modified.

        These days the TPM is usually part of the CPU. If you don't trust it then you are screwed anyway because your entire CPU is a microcode binary blob, encrypted to prevent you viewing or modifyi
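
        A simplified model of the boot-chain measurement mentioned above, using only hashlib. Real TPMs have multiple PCRs, hash banks, and sealing policies; this sketch only shows why changing any boot component changes the final measurement and invalidates previously sealed data.

          import hashlib

          def extend(pcr: bytes, component: bytes) -> bytes:
              # TPM-style PCR extend: new = H(old || H(component))
              return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

          def measure_boot(components: list[bytes]) -> bytes:
              pcr = bytes(32)                      # PCRs start at all zeros
              for c in components:
                  pcr = extend(pcr, c)
              return pcr

          original = measure_boot([b"firmware v1.0", b"bootloader", b"kernel"])
          updated  = measure_boot([b"firmware v1.1", b"bootloader", b"kernel"])
          # False: the measurement changed, so data sealed to the old value is lost.
          print("measurements match:", original == updated)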

  • by Anonymous Coward

    Um, the owners of the hardware would like some security protections for themselves, and the ability to audit all code running on their hardware, thanks! We do need better hardware protection mechanisms, but protection from the owner is not one of them.

    "Trusted" != "Trustworthy"; rather, they are mutually exclusive. No one wants enclaves for malware on their computer, and that is exactly what this is intended for. We are expected to implicitly "trust" that code running in black boxes, from industry, governme

  • They aren't "Confidential Computing Standards" they are "Confidential-Computing Standards". The former is, at best, ambiguous but generally would mean a computing standard which is itself confidential. The latter is a standard for computing confidentially.

    • No, the latter is a standard for convincing gullible idiots that they are computing confidentially* after handing all of their precious data to some random asshole's definitely not hacked, see it says it's not hacked, so it's not hacked OK?! data farm. After being told for the umpteenth time: "Don't put crap you want to keep private on the Internet." It's doubleplusgood!

      *: computing confidentially, i.e. hidden from all of those absolutely after their data specifically Corporate / Government espionage agen
  • ...confidential computing programming languages, like for example Confidential C (CC). In these languages, functions like "print" and "write" do not exist at all, to protect confidentiality of the computations.
  • Surprised no one here got the "code-word". "Confidential Computing Hardware" really means "Locked Down Computing". Sure the marketing article looks good, but we here should know where this is headed.
  • Sounds like the hardware companies want to be the only ones to have access to everyone's data. They can combine your data with the data of others and file exclusive patents on the combined work that none of the individual contributions would warrant on their own, thereby cementing their rule over their individual realms and domains.
