
Intel Announces Atom E3900 Series - Goldmont for the Internet of Things (anandtech.com)

Intel has announced the Atom E3900 series. Based upon the company's latest generation Goldmont Atom CPU core, the E3900 series will be Intel's most serious and dedicated project yet for the IoT market. AnandTech adds: So what does an IoT-centric Atom look like? By and large, it's Broxton and more. At its core we're looking at 2 or 4 Goldmont CPU cores, paired with 12 or 18 EU configurations of Intel's Gen9 iGPU. However this is where the similarities stop. Once we get past the CPU and GPU, Intel has added new features specifically for IoT in some areas, and in other areas they've gone and reworked the design entirely to meet specific physical and technical needs of the IoT market. The big changes here are focused on security, determinism, and networking. Security is self-evident: Intel's customers need to be able to build devices that will go out into the field and be hardened against attackers. Bits and pieces of this are inerieted from Intel's existing Trusted Execution Technology, while other pieces, such as boot time measuring, are new. The latter is particularly interesting, as Intel is measuring the boot time of a system as a canary for if it's been compromised. If the boot time suddenly and unexpectedly changes, then there's a good chance the firmware and/or OS has been replaced.
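As a rough illustration of the boot-time canary idea described above: record how long the system takes to reach a fixed point in startup and flag large, unexpected drift from a stored baseline. This is a hypothetical userspace sketch, not Intel's firmware-level mechanism; the baseline path and tolerance below are invented for illustration.

```python
# Hypothetical sketch of a boot-time canary: compare the measured boot
# duration against a stored baseline and flag large, unexpected drift.
# The baseline path and tolerance are invented; Intel's mechanism runs
# in firmware, not in a userspace script like this.
import json
import os

BASELINE_PATH = "/var/lib/boot-canary/baseline.json"  # hypothetical location
TOLERANCE = 0.15  # flag if boot time drifts more than 15% from baseline


def measured_boot_seconds() -> float:
    """Seconds since kernel start at the moment this runs (a crude proxy,
    assuming the script is launched at a fixed point in startup)."""
    with open("/proc/uptime") as f:
        return float(f.read().split()[0])


def check_boot_time() -> bool:
    """Return True if the boot time looks consistent with the baseline."""
    current = measured_boot_seconds()
    if not os.path.exists(BASELINE_PATH):
        os.makedirs(os.path.dirname(BASELINE_PATH), exist_ok=True)
        with open(BASELINE_PATH, "w") as f:
            json.dump({"seconds": current}, f)
        return True  # first boot establishes the baseline
    with open(BASELINE_PATH) as f:
        baseline = json.load(f)["seconds"]
    drift = abs(current - baseline) / baseline
    return drift <= TOLERANCE


if __name__ == "__main__":
    print("boot time OK" if check_boot_time()
          else "boot time changed: possible tampering")
```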
This discussion has been archived. No new comments can be posted.

  • More CPU power for the next DDoS.

    • A DDoS doesn't require CPU power, but maybe you were trying to be funny.

    • This isn't really "IoT". A home router is not IoT either. Yes, it's a thing, and it's on the internet, but so is your computer and your phone. This new CPU sounds like a power hog focused on being fast, whereas I'm working on something that has to run on a non-rechargeable battery for twenty years. Intel is focusing on the consumer IoT fad, I suspect.

      What an IoT chip *should* have: low power and security. Low power meaning an average draw of only a handful of microamps, the fewer the better, as it allows customers to add the
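For a sense of scale on "a handful of microamps" and "twenty years on a non-rechargeable battery", here is a back-of-the-envelope calculation; the cell capacity and current figures are illustrative assumptions, not anything from the article.

```python
# Back-of-the-envelope battery life for a low-power IoT node.
# All figures here are assumptions for illustration, not product specs.
CELL_CAPACITY_MAH = 2400     # e.g. a typical AA-size lithium primary cell
AVERAGE_CURRENT_UA = 10      # "a handful of microamps" average draw

hours = CELL_CAPACITY_MAH * 1000 / AVERAGE_CURRENT_UA  # mAh -> uAh, then / uA
years = hours / (24 * 365)

print(f"~{years:.0f} years")  # roughly 27 years at 10 uA average

# At 1 mA average the same cell lasts about 100 days, which is why a
# 6-12 W class part is in a different category entirely.
```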

  • by Anonymous Coward

    "Inerieted"? That's an odd word. Did the writer perhaps mean to type "inebriated"?

    • by Duhfus ( 960817 )
      I suspect they meant "inherited".
      • by Chrisq ( 894406 )

        "Inerieted"? That's an odd word. Did the writer perhaps mean to type "inebriated"?

        I suspect they meant "inherited".

        But they probably were inebriated

  • I wonder how useful the time it takes to boot would be as a measure of whether a ROM is compromised or not.

    For example, assuming the ROM uses Linux and has a few writable partitions, if it boots up and does a fsck, or just replays filesystem transaction logs, this will almost certainly be different each boot, especially if the system had a dirty shutdown.

    However, if the timing is measured from when the OS boots until it mounts the read-only RAM drive and gets ready to load the main OS, that is a lot more predictable.
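A sketch of that approach: timestamp a fixed early milestone (before any fsck or journal replay) using the monotonic clock, and have the canary check compare only that figure against its baseline. The milestone name and log path are hypothetical.

```python
# Hypothetical sketch: measure only the early, deterministic part of boot
# (kernel start up to a fixed milestone such as mounting the read-only
# ramdisk), rather than the whole boot including fsck/journal replay.
import time

MILESTONE_LOG = "/var/log/boot-milestone.log"  # invented path for illustration


def record_milestone(name: str) -> None:
    """Call this from the init script right after the milestone is reached."""
    # CLOCK_MONOTONIC counts from boot, so this is seconds since kernel start.
    elapsed = time.clock_gettime(time.CLOCK_MONOTONIC)
    with open(MILESTONE_LOG, "a") as f:
        f.write(f"{name} {elapsed:.3f}\n")


# Example: the initramfs init script would call
#   record_milestone("ro-ramdisk-mounted")
# and the baseline comparison then ignores everything that happens after
# the writable filesystems come up.
```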

    • by arth1 ( 260657 )

      I wonder how useful the time it takes to boot would be as a measure of whether a ROM is compromised or not.

      You mean system, not ROM. ROM cannot be compromised unless physically replaced, as it is by definition read-only.

      And all this will do is make any startup commands for malware run detached with a delay. That's child's play.

      But, as you allude to, it will likely lead to lots of false positives, as startup can depend not only on things like file system checks, but also on external factors like SSID broadcast frequency, DHCP response time, and so on.

      • by mlts ( 1038732 )

        s/ROM/firmware/g. In any case, a lot of malware remains in RAM. Yes, a reboot will fix it, but it can likely be added again, especially if compromised devices scan each other and re-compromise devices that were rebooted but are still vulnerable. Protecting the boot sequence does help, as firmware reflashes can be nasty and impossible to get rid of. However, what is needed is perhaps some thought about a hypervisor and limiting what each machine/container has access to. For example, one container m

    • Many of the recent attacks are RAM-only, so simply rebooting the device gets rid of the malware.

  • by lobiusmoop ( 305328 ) on Wednesday October 26, 2016 @09:43AM (#53154407) Homepage

    That's damn hungry for IoT...
    Meanwhile, ARM announces the Cortex M23 [cnx-software.com], apparently capable of running on harvested energy alone.

    • by c ( 8461 )

      That's damn hungry for IoT...

      Yeah... while Intel's trying to breed T-rexes, the rest of the IoT world is standardizing on velociraptors.

  • I may have missed something, but isn't a target TDP of 6.5-12 watts a little too much for IoT? I often read about ARM-based boards with TDPs of 0.9-2 watts. OK, those Atoms are most probably far more powerful, but their TDP still looks a little too high.
    • by I4ko ( 695382 )
      Yeah, what's the point? At 6 to 12 W one can comfortably run a Haswell U chip. Even the Celerons will have more oomph than the crappy Atom cores in this one. Oh... I get it... it is more spying via the integrated management engine, 'cause in IoT the users are the product.
    • From TFA:

      "the launch of the Atom E3900 series brings with it Intel’s first custom silicon targeting the roughly 6W to 12W market of more powerful IoT devices... ...As relatively high power processors these aren’t meant for wearables and such, but rather primarily devices on mains power where additional intelligence is needed. In Intel terminology, the E3900 is focused on “edge” devices as opposed to “core” devices. The idea being that Intel wants to move out data processi

      • by gtall ( 79522 )

        I give up; just what exactly are those "edge" devices? Refrigerators? Air conditioning units? Even if it is these sorts of things, connecting them to the internet is asking for trouble, seeing as security is not nailed down just yet. And what refrigerator company wants to ramp up their software effort to take advantage of the extra power? What will it buy them, and what do consumers get out of it? They will probably buy from some middleman who wants to sell millions of processing units... which means cost per un

    • It's Intel. When most people say IoT, they mean 'embedded thing that can run a network stack, low power, probably powered by batteries'. When Intel says IoT, they mean something subtly different: 'computer, plugged into the mains, probably running Windows'. The overlap between the two is that they're both talking about insecure systems connected to the Internet.
      • Why would it have to be Windows? There are plenty of embedded OSs that are x86 (even x86-only) and which would be perfectly happy with Goldmont: Minix, QNX, WindRiver, et al. A lot of them are not ported to ARM, and Goldmont would be just perfect for them.
    • The newest-generation Skylake and Kaby Lake U-series processors default to a 15 W TDP and can be configured down to a 7.5 W TDP. This gets you a full-powered Core i processor, not a weakened Atom chip. Is this device still a hampered, weakened Atom-type chip, or have they added some IoT-centric features to a Core i?
  • money wasted. (Score:4, Insightful)

    by Gravis Zero ( 934156 ) on Wednesday October 26, 2016 @10:03AM (#53154567)

    Intel doesn't understand what businesses want: inexpensive parts.
    Intel doesn't understand what hobbyists want: inexpensive parts that don't need NDAs.
    Intel doesn't understand what the world doesn't need: more power-hungry x86 platforms.
    Intel doesn't understand that we don't need them.

    • by harrkev ( 623093 )

      Intel doesn't understand that we don't need them.

      Yes, they understand that. That is why Slashdot announced that Intel was killing Atom chips just six months ago.

      https://mobile.slashdot.org/st... [slashdot.org]

      However, now they have a new Atom chip. What is going on? Did they kill Atom, or didn't they? Me confused.

      • The article mentions two specific Atom chips for a specific purpose (smartphones). It could also be that they've killed development but that this chip was already in production, so it might as well be pushed out, because humans are pretty bad about chasing a sunk cost with more money in some kind of idiotic hope. Now that they've pushed their latest stillborn child out into the world, Intel can hardly declare it a giant turd, so marketing comes up with some buzzwords or an intended purpose because they've go
    • Since when were Atoms power-hungry x86s?
  • One slide says "Enhanced IOs", but what kind of I/Os are there? I hope there are GPIOs, TW, SPI, etc.
  • by ( 4475953 )

    Man, I hate the chip market. I want to have an affordable 6 to 12 core chip with 5 to 6 GHz default clock rate, not this low-powered Internet of things crap. I hope AMD comes up with something soon that will make them have to take into account some competition again, or else we will be stuck with slow desktops forever.

    • by arth1 ( 260657 )

      Man, I hate the chip market. I want to have an affordable 6 to 12 core chip with 5 to 6 GHz default clock rate, not this low-powered Internet of things crap.

      I don't want overkill. I want something stable that won't need to be encased in a cubic meter of gold/lead alloy to be protected from cosmic rays because the fab process has shrunk to barely usable feature sizes. Something that will last 15+ years while delivering enough oomph, but not orders of magnitude more than I need.

      My main server is a PIII, and as it still runs the latest software, why would I need new hardware that's less reliable? It is more than enough to handle DNS, DHCP, internal web, incoming e-mail

    • I feel like you completely missed the point of this platform (embedded devices do not need shittons of CPU), but what, exactly, are you doing where raw CPU is the bottleneck in a desktop platform or architecture and you need 12 cores at 6 GHz? 'Slow desktops forever' my ass... 16 GB of RAM, SSDs, and bus-level access are commodity product specs now. If you have desktop software that is slow because of a CPU bottleneck, you either have shitty code, or your software should never have been designed for
      • Flight simulation, audio processing with a huge number of tracks and soft synths, compiling large software projects, stuff like that. It's really not hard to hit the limits of off-the-shelf PCs, and I don't have the money for a $5k workstation either.
  • Does anyone else see a "Microsoft Azure App Service" Sponsored Content entry right beneath this one on the main page?

  • What's Intel's definition of IoT?

    This looks like a human interface device (GUI) with some hardware control capability (enhanced determinism) rather than an embedded MCU.

    Perhaps something that could be used, for example, on a drone for higher-level functions such as command, navigation, and video (CV)?

  • 12 or 18 EU configurations of Intel’s Gen9 iGPU

    What is an EU in this context? Execution Unit?
