Intel Education Programming Hardware

Intel Quietly Discontinues Galileo, Joule, and Edison Development Boards (intel.com)

Intel is discontinuing its Galileo, Joule, and Edison lineups of development boards. The chipmaker quietly made the announcement last week. From the company's announcement: Intel Corporation will discontinue manufacturing and selling all skus of the Intel Galileo development board. Shipment of all Intel Galileo product skus ordered before the last order date will continue to be available from Intel until December 16, 2017. [...] Intel will discontinue manufacturing and selling all skus of the Intel Joule Compute Modules and Developer Kits (known as Intel 500 Series compute modules in the People's Republic of China). Shipment of all Intel Joule product skus ordered before the last order date will continue to be available from Intel until December 16, 2017. Last time orders (LTO) for any Intel Joule products must be placed with Intel by September 16, 2017. [...] Intel will discontinue manufacturing and selling all skus of the Intel Edison compute modules and developer kits. Shipment of all Intel Edison product skus ordered before the last order date will continue to be available from Intel until December 16, 2017. Last time orders (LTO) for any Intel Edison products must be placed with Intel by September 16, 2017. All orders placed with Intel for Intel Edison products are non-cancelable and non-returnable after September 16, 2017. The company hasn't shared any explanation for why it is discontinuing these development boards. Intel launched the Galileo, an Arduino-compatible mini computer, in 2013; the Edison in 2014; and the Joule last year. The company touted the Joule as its "most powerful dev kit." You can find the announcement posts here.
  • I'm not surprised Intel is doing this. When your competition for IoT devices includes the widely available Arduino, Raspberry Pi, and other simple, cheap boards with legions of followers? Embedded stuff is either going COTS (commercial off-the-shelf) or very highly customized. At least that's my thought.

    • by H3lldr0p ( 40304 ) on Monday June 19, 2017 @12:37PM (#54648699) Homepage

      Exactly. Why buy into an ecosystem that's not as flexible as the others? Did they offer superior documentation and support? Superior integration? Anything at all aside from the brand?

      Guessing the bosses at the top want to retrench and focus on their server & consumer spaces now that AMD has shaken up the market once more. This being a tiny space, I doubt it ever made enough money to justify the ongoing costs needed to crowd out all of the established open hardware.

      • by Chris Katko ( 2923353 ) on Monday June 19, 2017 @04:05PM (#54650283)

        To add onto your post,

        When I was in college, I backed/bought a third-party board. It was faster than the Arduino but pin-compatible. This was before there were so many options, but the experience is still applicable.

        I bought it, and ran into problems. The hardware was fine, but the SOFTWARE chain had problems: the IDE crashed, and flashing/detecting the serial port was unreliable. It was a pain in the ass. "Go online and search for a fix" doesn't actually work when: There's like 10 people at the company and

        Another slap in the face? I realized I had bought a "Beta" board. They said it could have problems, but that it was tested and sound. The problem? They then produced the "official" board, which wasn't pin- or software-compatible with the Beta board.

        So I spent $60-80... on a paperweight that can't be programmed.

        Additionally, there are zero third-party tutorials and almost zero forums with knowledge of the device. It's almost impossible to crowdsource a problem with it.

        Another problem? Just like Intel here, what happens if the product is discontinued or no longer supported by the company?

        I've learned the hard way that you're not buying a product, you're buying a PLATFORM. And the platform (documentation, official and third-party support, hardware, and more) needs to be heavily entwined in your cost/benefit calculations. It can't just be "speed vs. cost."

        As I've looked for better Arduinos and Raspberry Pis, I've consistently applied that logic and found zero viable alternatives. Even if they could compete on cost, they can't compete on TIME investment. There are thousands of Arduino/Pi tutorials. Good official documentation. Thousands of active programmers to assist you, and over a decade of toolchain support.

        I've been learning the D language over the last year or two. I love it (except the garbage collector, which adds an entire additional dimension to crash solving). Otherwise, it's pretty amazing (so much so that the C++ committee keeps adding features D has had for over a decade). They have one great forum, and StackOverflow can probably solve your problem. But that's kind of it. There aren't dozens of _maintained_ D XML parsing libraries. Dozens of JSON libraries. Dozens of game programming libraries. Dozens of X/Y/Z libraries. In C and C++ you have your pick of the litter. Any possible question, no matter how niche, has a C/C++ library. Library for the reverse-engineered Kinect2 sensor? Yep.

        But while D can interface cleanly with C, it doesn't support C++. And that's a huge flaw, because it cuts you off from basically "almost every library ever written" in the last three decades. Programming in D is a delight, but you HAVE to re-invent the wheel for things that come for free in C/C++. So I've been very hesitant to switch over completely to D. What happens if the community dies out? Do I really want to write a hobby game in a dead language?

        (There is a fork of the LLVM-based LDC, Calypso, which integrates Clang with LDC for C++ support. And it's a highly watched project. But it's even more niche. Do I hedge my game on an almost-niche language, with a niche fork of a compiler that is 300 commits behind the official LDC branch? What if I run into a bug that is solved in the new branch but not the fork? I'm relying on a lot of other people's charity work in my build chain.)

        So if I can distill all my points down to one: "For production, buy what's popular--not what's clever."

        • by Megane ( 129182 )

          I've learned the hard way that you're not buying a product, you're buying a PLATFORM.

          This is one of the reasons that I've stuck with the mbed platform. From the time five-ish years ago when an NXP rep left behind an LPC1768 mbed at my work (fuck the 1st-gen LPCXpressos he left too; their debug interfaces sucked and weren't worth working around), I found a good paradigm of using C++ that I was able to apply to my own embedded coding at work. Best of all, it was system-agnostic (programmed via copying a binary to a USB filesystem), which meant it didn't require Windows, like so many micro-
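          For what it's worth, the whole paradigm fits in a dozen lines. A minimal sketch using the classic mbed API (LED1 is the board's first status LED; the half-second wait is arbitrary):

          // Classic mbed "hello world": build it, then drag the resulting .bin
          // onto the board's USB mass-storage drive. No vendor flashing tool,
          // no Windows dependency.
          #include "mbed.h"

          DigitalOut led(LED1);   // first on-board status LED

          int main() {
              while (true) {
                  led = !led;     // toggle the LED
                  wait(0.5);      // seconds (classic mbed 2 API)
              }
          }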

      • by Lumpy ( 12016 )

        Nope, in fact Intel had the crappiest support and documentation available. Almost nobody used their stuff.

        • Kinda right, kinda wrong.

          I own every Intel device mentioned, and just about every other damn variant of IoT processing board, from complex devices with operating systems down to the bare-bones Atmel- and PIC-driven ones. The Intel boards are pretty damn wonderful. I never thought they'd be around for long, though; the margins on those things just aren't what Intel is in the business for. The maker market is way too small for them.

          That being said, the chips that power said Intel maker boards sell
          • Intel just isn't the right kind of company to succeed in the Maker market, but I will miss the availability of their processors on clean and cheap development boards.

            Sigh, yeah. I think the Atom/Quark combo on the Edison had tremendous potential. The Atom running Linux for the heavy lifting (yet has full access to the I/O), and the Quark for the 10% of the things that actually need to be real-time. Nice.

            SparkFun did a lot to overcome the prototyping problem introduced by that damned connector, but I think Intel lost the war of perception. Any Maker-class guy takes one look at that connector and wonders how in the hell he's going to overcome that hurdle, and

            • Sigh, yeah. I think the Atom/Quark combo on the Edison had tremendous potential. The Atom running Linux for the heavy lifting (yet has full access to the I/O), and the Quark for the 10% of the things that actually need to be real-time. Nice.

              I really like the "virtual-memory heavy-lifting device paired with one or more coprocessors with predictable instruction timing and linear memory maps" model.
              ARM licensees have played around with it since ye olden days (ARM9s paired with ARM7TDMIs), and TI has a part line (Sitara) that pairs a modern ARM with some proprietary coprocessors for running real-time process kernels without an OS.

              Intel's Atom/Quark proc is really the best offering I've ever seen in that segment, though working with the Quark directly without signing an NDA is a complicated mess (though one can figure it out).
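              To make the division of labor concrete, here's a rough sketch of the Linux-side half of that model. The /dev/ttyMCU0 path, the 3-byte command, and the ACK byte are all invented for illustration; they stand in for whatever link and protocol the coprocessor actually speaks:

              // Atom/Linux side: compute the plan in non-real-time code, then
              // hand the deadline-critical work to the coprocessor over a
              // serial link. Device path and protocol are hypothetical.
              #include <fcntl.h>
              #include <termios.h>
              #include <unistd.h>
              #include <cstdint>
              #include <cstdio>

              int main() {
                  int fd = open("/dev/ttyMCU0", O_RDWR | O_NOCTTY);
                  if (fd < 0) { perror("open"); return 1; }

                  termios tio{};
                  tcgetattr(fd, &tio);
                  cfmakeraw(&tio);                      // raw bytes, no line discipline
                  cfsetspeed(&tio, B115200);
                  tcsetattr(fd, TCSANOW, &tio);

                  uint8_t cmd[3] = {0x01, 0x80, 0x00};  // e.g. "PWM channel 1 -> 50%"
                  write(fd, cmd, sizeof cmd);           // the MCU side meets the timing

                  uint8_t ack = 0;
                  read(fd, &ack, 1);                    // block until the coprocessor acks
                  close(fd);
                  return ack == 0x06 ? 0 : 1;           // 0x06 = ASCII ACK, by convention
              }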

              • Intel's Atom/Quark proc is really the best offering I've ever seen in that segment, though working with the Quark directly without signing an NDA is a complicated mess (though one can figure it out). I am glad I bought several Edisons. Yocto may be a pile of shit, but better support will come in time, and it'll become more reasonable to roll your own OS for the Atom side, and people will figure out the Quarks, and you'll be able to do direct loads onto it without negotiating with some way-heavier-than-needed real-time OS kernel running on it.

                Unfortunately I've been around enough to see what happens when ecosystems dry up. The Linux will age and have more and more issues that the community cannot keep up with. The community will shrink by attrition, and no new blood will come in simply because you can't buy the hardware anymore. Dead end, as much as I hate to think about it.

                Turns out I only have three Edisons, and while I'd love to dive in and figure out the potential of this wonderful part, I just can't justify the extremely limited bandwi

      • The Galileo gave you a 400 MHz x86 with Arduino-compatible I/O. It also had a solid FPU and true potential to be the ultimate core of 3D printers. If only they'd done a Mega version, it would have been fantastic. And honestly, the FPU performance was something quite beautiful. Combined with an FPGA board, this device was a thing of absolute beauty.

        I know it's not allowed on Slashdot to say nice things about Intel or Microsoft, but to be honest, I like the x86/Visual Studio platform when it comes to development. I suppo
    • by gigne ( 990887 ) on Monday June 19, 2017 @01:34PM (#54649189) Homepage Journal

      This sums up my experience.
      http://hackaday.com/2017/06/19... [hackaday.com]

      • by Megane ( 129182 )

        I read that this morning, and I'm going to have to agree.

        - Documentation was too hard to get, even for people who knew Intel engineers (apparently the specific example was trying to use DMA with SPI). China gets away with a lot of poor documentation because their stuff is so cheap.

        - That damn connector may be amazingly compact, but it also made the board hard to work with. It had a limited selection of base boards unless you had a PCB engineer who could design a custom one, so you usually ended up with more t

    • by ArchieBunker ( 132337 ) on Monday June 19, 2017 @02:08PM (#54649403)

      The Pi uses binary blobs. Its intent was to be cheap for students, not to be an open-source platform.

    • by Khyber ( 864651 )

      So where in your thought process did you fail to think "Intel is probably one of the kings of COTS equipment"?

      Cuz I got news for you: the 386, while discontinued, is still a hot-shit selling item for embedded shit.

      • Which might actually have been a dire warning for the people at Intel behind the Galileo and Edison devices: both were x86, but violated enough legacy-PC expectations that the OSes had to be ported (did anyone aside from Intel's Linux branch get interested?); and any of the old "basically use DOS as an RTOS by ignoring it for time-critical stuff" x86 applications were unlikely to work; plus reports on the quality of the documentation range from "frustrating" to "dire".

        386s, by contrast, are markedly sl
        • Not just that, there ain't any added value in having any of the modern CPUs, like the Atom, in such a box. One does not need multiple cores or MMX or SSE instructions, and it helps that the 386 has just some 100+ pins as opposed to 400+. 16-bit is probably inadequate for embedded systems, but 32-bit is perfect, and there's no need to go 64-bit, which is what modern Intel CPU architectures are.

          Incidentally, are all the 386 patents still active, or have they expired? If the latter, any fabless

          • I think that SiS' old 486-to-Pentium-ish designs are carrying on as the Vortex86 [dmp.com.tw]. If they aren't now, they were until comparatively recently.

            On the FPGA side, there is ao486 [github.com]. I don't know much about it, but it seems similar to what you have in mind.
    • No one really cares how open a platform is. The winners of the IoT hobby world are not interested in "open". The Raspberry Pi famously runs an ARM core that is buried under NDAs and binary blobs.

      The winners in this field are determined by ecosystems and communities. The Arduino platform is quite a poor performer and its libraries were famously crap, to say nothing of the god-awful IDE compared to AVR Studio, or the stupid design decision that led to one set of pins being off-centre, locking out a whole l

      • The BeagleBone could have been another option. In addition to the usual Linux sources, Minix was also ported to it, so that would be a fantastic platform to build on.

  • by bettodavis ( 1782302 ) on Monday June 19, 2017 @12:41PM (#54648723)
    Good to remember that not long ago, Intel PR touted IoT as the Next Big Thing and the company followed suit, with entire groups of people dedicated to getting these products out of the fab.

    These development platforms (the vehicle for getting their IoT processors into product makers' hands) now being discontinued most likely means the sales were disappointing, that those groups are probably no more, and that there won't be any follow-up.

    Which is not that surprising, given Intel is used to earning a living from high-margin products, not cheap stuff that needs to sell millions to make a margin.

    Seems like this market, like Mobile before it, will belong to ARM.
    • When you get a CFO who is more interested in cost cutting than innovation, experiments like IoT that have yet to see profitability get shut down.

      The next round of layoffs is going to be all the IoT groups.

    • by timholman ( 71886 ) on Monday June 19, 2017 @02:18PM (#54649489)

      These development platforms (the vehicle for getting their IoT processors into product makers' hands) now being discontinued most likely means the sales were disappointing, that those groups are probably no more, and that there won't be any follow-up.

      I don't think there was ever any serious commitment to the Galileo platform at Intel.

      I was contacted by Intel in Dec. 2014 and asked if I wanted some free Galileo boards plus Grove sensor kits to evaluate for academic use. It took them six months to ship the boards to me. Three times I emailed them, and each time a different person responded, because the previous contact had transferred to another group. After many apologies, I finally got the boards in June, but Intel had missed the window for us to incorporate them into the 2015-16 labs, and there was nothing compelling enough in their specs to make any faculty want to try them in place of Arduinos or BeagleBoards.

      Last August, I gave one of the Intel kits to my teaching assistant to evaluate for use in our electronics lab. His report to me was that the Galileo boards were unsuitable, as their slow I/O made them unusable for the D/A conversion experiments that we needed them for. My TA then checked and found out that Intel had dropped their academic program entirely, so he built a board using a standard Atmel processor instead.

      Given the huge amount of churn in Intel personnel working on Galileo, it was painfully obvious that their academic IoT push was doomed from the get-go. Intel still wants to sell $400 processors, not $2 IoT chips, and that is clearly where the internal prestige and employee rewards are being directed within the company.

    • They tried the Curie chip, but it was a flop. The Arduino 101 has no sales, no projects, and the Intel "stack" is very nonstandard and has no traction with devs.

      Their expensive boards were a yawn: good technically, but WAY overpriced and, given Intel's history, not trustable to be around for very long.

      I DEMAND AN 'UPDATE STORY' and also SHELF SPARES to be kept around at the vendor side for years. If not, then I have no faith in your 'platform'.

      Intel needs to be broken up, like the old phone company. Companies

      • by Megane ( 129182 )

        fwiw, the 'blue pill' is the next big thing and intel lost out, entirely. you can pay well over $100 or you can pay $2. I know what I would do ;)

        I bought 20 of the bluepill boards a few months ago when they were mentioned on Hackaday. I plan to use them for small USB HID device projects, and I already have one working with mbed code (the CPU is equivalent to one on an ST Nucleo board) and an ST-Link v2. I've been working with STM32 since 2010, so it's like bread and butter to me, especially the F103.

    • I think that for IoT, things like RISC-V or OpenRISC might have a chance, since they are open as well. Such systems might be complete FPGAs with RISC-V cores in them, in an SoC configuration, which would be usable for IoT purposes.
  • by claytongulick ( 725397 ) on Monday June 19, 2017 @12:41PM (#54648731) Homepage

    I mean, on paper the specs are great, but I've actually done projects with these things, and they're seriously junk. They burn out if you look at them wrong. Additionally, they have a 1.8V GPIO level, so there's basically zero chance that you can use any other peripheral without level shifting.

    I've talked to a lot of other folks about them as well, they have a terrible reputation in the maker community.

    And they're expensive.

    So yeah, I'm not surprised. I abandoned them after a single project, like most other folks I know.

    • by Khyber ( 864651 )

      "they have a terrible reputation in the maker community"

      Well, duh. Anyone with a basic idea of electronics knows this is too much shit for a simple task.

  • by im_thatoneguy ( 819432 ) on Monday June 19, 2017 @01:08PM (#54648947)

    Isn't it also possible that they will be announcing and releasing a new product before December 31st?

    • by Anonymous Coward

      Would you trust a new Intel IoT thing for making a product if they announced the end of life of this one within two years of launch? That is very fast, even compared to something as volatile as smartphone CPUs.

    • Possible, but in that case, don't you announce the replacement(s) first and then discontinue the replaced products, to avoid this kind of misunderstanding?
      • Possible, but Microsoft has discontinued tons of products to much gnashing of teeth, only to release the replacement a month later.

    • Isn't it also possible that they will be announcing and releasing a new product before December 31st?

      Oh, I hope it's a CPU. It's very likely that while they were chasing other businesses and resting on their laurels, and while AMD stole their lunch, they realised they'd done crap-all in the CPU market.

    • Probably not before the end of the year, but this seems to be part of their routine. Back in the '80s, Intel had the general-purpose 8051 microcontroller (and the 8048 that was in the IBM PC keyboard interface). They killed it off to focus on x86 products. Then in the late 1990s or early 2000s, they released an ARM microcontroller (XScale). That lasted a few years, and they killed it off to focus on their x86 stuff. At least this time, they tried to make microcontrollers out of the x86 architecture.

  • by Anonymous Coward on Monday June 19, 2017 @01:30PM (#54649153)

    I may or may not work for the vendor of these products.
    I may or may not have had a hand in designing the chips.
    I purchased a Galileo to mess with. After all, I know the chips quite well.

    It was utterly unusable. I couldn't even light the LED. The documentation was a walkthrough of how to light the LED, but it didn't work. Involved in this was a whole software layer that made the native hardware interfaces look like some other board's at the API level, which was obviously daft if you are trying to get people to know and understand the chip so that they choose to design it into products. I failed to crack through this layer of obfuscation before I gave up and did something more productive.
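    (For anyone who hits the same wall: the usual way around a compatibility shim like that on a Linux-based board was the sysfs GPIO interface, the standard mechanism in the Galileo era. A rough sketch; GPIO 40 is a placeholder, since the Galileo's pin-to-GPIO mapping was board-specific, and that mapping was exactly the poorly documented part:)

    // Drive a pin directly through Linux's sysfs GPIO interface (long since
    // deprecated, but current at the time). Needs root; GPIO 40 is a placeholder.
    #include <fstream>
    #include <string>
    #include <thread>
    #include <chrono>

    static void write_file(const std::string& path, const std::string& value) {
        std::ofstream f(path);
        f << value;  // real code should check f.good() and handle permissions
    }

    int main() {
        const std::string gpio = "40";
        write_file("/sys/class/gpio/export", gpio);                      // expose the pin
        write_file("/sys/class/gpio/gpio" + gpio + "/direction", "out");

        for (int i = 0; i < 10; ++i) {                                   // five blinks
            write_file("/sys/class/gpio/gpio" + gpio + "/value", i % 2 == 0 ? "1" : "0");
            std::this_thread::sleep_for(std::chrono::milliseconds(500));
        }
        write_file("/sys/class/gpio/unexport", gpio);                    // clean up
        return 0;
    }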

    • by Anonymous Coward

      I may or may not work for the vendor of these products.
      I may or may not have had a hand in designing the chips.

      No wonder they turned out to be complete turkeys if Intel's employees can't even remember who they work for, or what they do there.

    • Thanks for trying. Edison was an amazing little chunk of hardware for certain purposes (mine was low-power systems that interfaced to things with proprietary x86 drivers), but it always felt like it was one hardware guy's pet project that nobody in the software department gave half a rotten rat's ass about.

      The crap they had instead of tech support was a legendary middle finger to the customers: a bunch of clueless, barely-English-literate drones who did nothing but reply to your post about something wron

  • by e r ( 2847683 ) on Monday June 19, 2017 @01:37PM (#54649221)
    What's so pathetic is that they basically pulled a Microsoft-and-mobile on this.

    Arduino, BBB, and RPi had already been out for years before Intel finally figured out that there was a market there.
    Then, when they finally got off their butts they came to the party with a stupidly overpriced offering that didn't fit with the existing ecosystem.
    Why did they even try doing their own thing at all instead of helping to improve what already existed? For example, why not work with ODROID to put Intel chips on their boards instead of ARM?

    This whole thing was stupid and ham-fisted on Intel's part-- whoever the exec was that made the decision should get a stern talking-to.

    This also matches up with Intel's flailing in response to AMD's recent surge (sad as it was that AMD was on the ropes for so long).
    • What else were they going to do?

      I mean it's not like they have any competition in the CPU market so why bother working on a CPU. Find another way to make money. It's like Microsoft. They don't have any competition in the OS market, so why bother working on an OS.

      This is standard for a huge company with a monopoly. Rest on laurels until someone comes along and pulls the rug from under them.

      • by e r ( 2847683 )
        Ok, I agree with you. Getting into IoT seemed like the right move for them.
        What I'm really saying is "If they were going to half-ass it like they did then why did they even bother?".
      • Actually, they do have serious competition in the CPU market: THEMSELVES. They can't push their shit b'cos their previous shit was so good that nobody needs to replace it. Hence the need to hunt for new markets.

        But another good business plan for Intel might be to become a TSMC or Samsung, and start fabbing chips for Qualcomm and others.

  • I would hope that the whole reason they are discontinuing these products is the realization that they don't even compete with the ARM products out there. Hell, the ESP8266 showed that people will tolerate a realistically unknown CPU instruction set, locked-in firmware, and a horrific manufacturer SDK. None of it matters if you just sell it cheap enough. So why, if Intel wanted to compete, slap an Atom and bare-bones support chips onto an IoT board at a price that guarantees no one will use it?

    They

    • "Hell, even just optimize the original Pentium core"

      Quark is in fact a P54C.

    • Ain't 3.3V rather old by now, particularly given how low power replaced high performance as the requirement at least a decade ago? Also, what voltage do chips like RISC-V or ARM run at? 3.3V?
  • I am a trifle surprised to see the Joule among the dead.
    The others were hopeless: too cut down (in terms of "IBM PC" stuff) for x86 compatibility to be of much use, and notably lousy at GPIO twiddling compared to microcontrollers or devices like the TI ARM part in the BeagleBone; but at least they were expensive!

    The Joule, though, was a stock Atom part plus some RAM, flash, and a NIC on a little computer-on-module. Not based on some weirdo part, and it allowed you to drop a more or less standard Atom-based s
  • by dohzer ( 867770 )

    SKUs

  • I'm not heavily invested in this platform, except that over the past couple of years I've had a fetish for the SparkFun blocks and was "any day now" going to open up some time to use them in a custom board for a side gig or, worst case, resume fodder. So I've got 5-6 of them laying around, the big breakout and the little breakout, and as I said, a bunch of red prototyping boards.

    At this point I don't know how I'm going to justify experimenting with all the kit I've accumulated. For all the flaws in the product-
