Intel Quietly Discontinues Galileo, Joule, and Edison Development Boards (intel.com) 95
Intel is discontinuing its Galileo, Joule, and Edison lineups of development boards. The chip-maker quietly made the announcement last week. From the company's announcement: Intel Corporation will discontinue manufacturing and selling all skus of the Intel Galileo development board. Shipment of all Intel Galileo product skus ordered before the last order date will continue to be available from Intel until December 16, 2017. [...] Intel will discontinue manufacturing and selling all skus of the Intel Joule Compute Modules and Developer Kits (known as Intel 500 Series compute modules in People's Republic of China). Shipment of all Intel Joule product skus ordered before the last order date will continue to be available from Intel until December 16, 2017. Last time orders (LTO) for any Intel Joule products must be placed with Intel by September 16, 2017. [...] Intel will discontinue manufacturing and selling all skus of the Intel Edison compute modules and developer kits. Shipment of all Intel Edison product skus ordered before the last order date will continue to be available from Intel until December 16, 2017. Last time orders (LTO) for any Intel Edison products must be placed with Intel by September 16, 2017. All orders placed with Intel for Intel Edison products are non-cancelable and non-returnable after September 16, 2017. The company hasn't shared any explanation for why it is discontinuing the aforementioned development boards. Intel launched the Galileo, an Arduino-compatible mini computer, in 2013, the Edison in 2014, and the Joule last year. The company touted the Joule as its "most powerful dev kit." You can find the announcement posts here.
Re: (Score:2)
Guess you missed the 'critical thinking' value add from school, you lazy inattentive fuck.
Re: (Score:2)
If you don't know what something is, either you need to learn about it or ignore it.
Re: (Score:2)
Okay, but how is he supposed to know which of those two to do?
Journalists used to tell you why something mattered or who it mattered to.
Now they regurgitate press releases, government statements, and Tweets.
If some celebrity tweets something and it "goes viral," Wolf Blitzer will provide around-the-clock coverage of the situation, and there will be talking heads going on and on about it being part of the most important national conversation going on today.
Re: (Score:2)
You have to admit that the topic is at least News for Nerds this time. Even if it doesn't intersect with your own interests.
When it's not an open platform, it'll probably die (Score:5, Insightful)
I'm not surprised Intel is doing this. When your competition for IoT devices includes the widely available Arduino, Raspberry Pi, and other simple, cheap boards with legions of followers? Embedded stuff is either going to COTS (Commercial Off-The-Shelf) parts or very highly customized. At least that's my thought.
Re:When it's not an open platform, it'll probably (Score:4, Insightful)
Exactly. Why buy into an ecosystem that's not as flexible as the others? Did they offer superior documentation and support? Superior integration? Anything at all aside from the brand?
Guessing the bosses at the top want to retrench and focus on their server & consumer spaces now that AMD has shaken up the market once more. This being such a tiny space, I doubt it ever made enough money to justify the ongoing costs needed to crowd out all of the established open hardware.
Re:When it's not an open platform, it'll probably (Score:5, Insightful)
To add onto your post,
When I was in college, I backed/bought a 3rd-party board. It was faster than Arduino but pin-compatible. It was before there were so many options, but the experience is still applicable.
I bought it, and ran into problems. The hardware was fine, but the SOFTWARE chain had problems: crashing the IDE, flashing, and detecting the serial port. It was a pain in the ass. "Go online and search for a fix" doesn't actually work when there's like 10 people at the company and
Another slap to the face? I realized I had bought a "Beta" board. They said it could have problems but it was tested and sound. The problem? They then produced the "official" board which wasn't pin or software compatible with the Beta board.
So I spent $60-80... on a paperweight that can't be programmed.
Additionally, there are zero 3rd party tutorials, almost zero forums with knowledge of the device. It's almost impossible to crowdsource a problem with it.
Another problem? Just like Intel here, what happens if the product is discontinued or no longer supported by the company?
I've learned the hard way that you're not buying a product, you're buying a PLATFORM. And the platform (documentation, official and third-party support, hardware, and more?) needs to be heavily entwined in your cost/benefit calculations. It can't just be "speed vs cost."
As I've looked for better alternatives to the Arduino and Raspberry Pi, I've consistently applied that logic and found zero viable ones. Even if they could compete on cost, they can't compete on TIME investment. There are thousands of Arduino/Pi tutorials. Good official documentation. Thousands of active programmers to assist you, and over a decade of toolchain support.
I've been learning the D language over the last year or two. I love it (except the garbage collector, which adds an entire additional dimension to crash solving). Otherwise, it's pretty amazing (so much so that the C++ committee keeps adding features D has had for over a decade). They have one great forum, and Stack Overflow can probably solve most problems. But that's kind of it.

There aren't dozens of _maintained_ D XML parsing libraries. Dozens of JSON libraries. Dozens of game programming libraries. Dozens of X/Y/Z libraries. In C and C++ you have your pick of the litter. Any possible question, no matter how niche, has a C/C++ library. Library for the reverse-engineered Kinect2 sensor? Yep. But while D can interface cleanly with C, its C++ support is minimal. And that's a huge flaw, because it cuts you off from basically almost every library written in the last three decades. Programming in D is a delight, but you HAVE to re-invent the wheel for things that come for free in C/C++.

So I've been very hesitant to switch over completely to D. What happens if the community dies out? Do I really want to write a hobby game in a dead language? (There is a fork of the LLVM-based LDC, Calypso, which integrates Clang with LDC for C++ support. And it's a highly watched project. But it's even more niche. Do I hedge my game on a niche language, with a niche fork of a compiler that is 300 commits behind the official LDC branch? What if I run into a bug that is solved in the official branch but not the fork? I'm relying on a lot of guys' charity work in my build chain.)
So if I can distill all my points down to one: "For production, buy what's popular--not what's clever."
Re: (Score:2)
I've learned the hard way that you're not buying a product, you're buying a PLATFORM.
This is one of the reasons that I've stuck with the mbed platform. From the time five-ish years ago that an NXP rep left behind an mbed NXP LPC1768 at my work (fuck the 1st-gen LPCXpressos that he left too; their debug interfaces sucked and weren't worth working around), I found a good paradigm of using C++ that I was able to apply to my own embedded coding at work. Best of all, it was system-agnostic (programmed via copying a binary to a USB filesystem), which meant it didn't require Windows, like so many micro-
Re: (Score:2)
Nope, in fact Intel had the crappiest support and documentation available. Almost nobody used their stuff.
Re: (Score:2)
I own every Intel device mentioned, and just about every other damn variant of IoT processing boards from complex devices with operating systems, down to the bare bones Atmel and PIC micro driven ones. The Intel boards are pretty damn wonderful. I never thought they'd be around for long though- the margins on those things just aren't what Intel is in the business for. The maker market is way too small for them.
That being said, the chips that power the said Intel maker boards sell
Re: (Score:2)
Intel just isn't the right kind of company to succeed in the Maker market, but I will miss the availability of their processors on clean and cheap development boards.
Sigh, yeah. I think the Atom/Quark combo on the Edison had tremendous potential. The Atom running Linux for the heavy lifting (yet has full access to the I/O), and the Quark for the 10% of the things that actually need to be real-time. Nice.
SparkFun did a lot to overcome the prototyping problem introduced by that damned connector, but I think Intel lost the war in terms of perception. Any Maker-class guy takes one look at that connector and wonders how in the hell he's going to overcome that hurdle, and
Re: (Score:2)
Sigh, yeah. I think the Atom/Quark combo on the Edison had tremendous potential. The Atom running Linux for the heavy lifting (yet has full access to the I/O), and the Quark for the 10% of the things that actually need to be real-time. Nice.
I really like the "virtual-memory-mapped heavy-lifting device paired with one or more coprocessors with predictable instruction timing and linear memory maps" model.
ARM licensees have played around with it since ye olden days (ARM9s paired with ARM7TDMIs), and TI has a product line (Sitara) that pairs a modern ARM with some proprietary coprocessors for running real-time process kernels without an OS.
Intel's Atom/Quark proc is really the best offering I've ever seen in that segment, though working with the Quark directly without signing an NDA is a complicated mess (though one can figure it out).
Re: (Score:2)
Intel's Atom/Quark proc is really the best offering I've ever seen in that segment, though working with the Quark directly without signing an NDA is a complicated mess (though one can figure it out).

I am glad I bought several Edisons. Yocto may be a pile of shit, but better support will come in time, and it'll become more reasonable to roll your own OS for the Atom side, and people will figure out the Quarks, and you'll be able to do direct loads onto it without negotiating with some way-heavier-than-needed real-time OS kernel running on it.
Unfortunately, I've been around enough to see what happens when ecosystems dry up. The Linux port will age and have more and more issues that the community cannot keep up with. The community will shrink by attrition, and no new blood will come in, simply because you can't buy the hardware anymore. Dead end, as much as I hate to think about it.
Turns out I only have three Edisons, and while I'd love to dive in and figure out the potential of this wonderful part, I just can't justify the extremely limited bandwi
Re: (Score:3)
I know it's not allowed on Slashdot to say nice things about Intel or Microsoft, but to be honest, I like the x86/Visual Studio platform when it comes to development. I suppo
Re: (Score:2)
RISC-V needs to branch a lot more than instruction sets with conditional instructions, and that would mess with pipelines and such.
Re: When it's not an open platform, it'll probably (Score:5, Informative)
This sums up my experience.
http://hackaday.com/2017/06/19... [hackaday.com]
Re: (Score:2)
I read that this morning, and I'm going to have to agree.
- Documentation was too hard to get, even for people who knew Intel engineers (apparently the specific example was trying to use DMA with SPI). China gets away with a lot of poor documentation because their stuff is so cheap.
- That damn connector may be amazingly compact, but it also made the board hard to work with. It had a limited selection of base boards unless you had a PCB engineer who could design a custom one, so you usually end up with more t
Re:When it's not an open platform, it'll probably (Score:5, Informative)
The Pi uses binary blobs. Its intent was to be cheap for students, not to be an open-source platform.
Re: (Score:2)
So where in your thought process did you fail to think "Intel is probably one of the kings of COTS equipment"?
Cuz I got news for you: the 386, while discontinued, is still a hot-shit-selling item for embedded shit.
Re: (Score:3)
386s, by contrast, are markedly sl
Re: (Score:2)
Not just that, there ain't any added value in having any of the modern CPUs, like the Atom, in such a box. One does not need multiple cores or MMX or SSE instructions, and it helps that the 386 has just 100+ pins as opposed to 400+. 16-bit is probably inadequate for embedded systems, but 32-bit is perfect, and there's no need to go 64-bit, which is where modern Intel architectures are.
Incidentally, are all the 386 patents still active, or have they expired? If the latter, any fabless
Re: (Score:2)
On the FPGA side, there is ao486 [github.com]. I don't know much about it, but it seems similar to what you have in mind.
Openness has nothing to do with it (Score:2)
No one really cares how open a platform is. The winners of the IoT hobby world are not interested in "open". The Raspberry Pi famously runs on a Broadcom SoC whose GPU and boot process are buried under NDAs and binary blobs.
The winners in this field are determined by ecosystems and communities. The Arduino platform is quite a poor performer and its libraries were famously crap, to say nothing of the god-awful IDE compared to AVR Studio, or the stupid design decision that led to one set of pins being off centre, locking out a whole l
Re: (Score:2)
Beaglebone could have been another option. In addition to the usual Linux sources, Minix was also ported to it, so that would be a fantastic platform to build on.
The end of the IoT road at Intel? (Score:5, Interesting)
These development platforms were the vehicle for getting Intel's IoT processors into product makers' hands. Discontinuing them most likely means the sales were disappointing, that these groups are probably no more, and that there won't be any follow-up.
Which is not that surprising, given Intel is used to earning a living from high-margin products, not cheap stuff that needs to sell in the millions to make a margin.
Seems like this market, like Mobile before it, will belong to ARM.
Re: (Score:2)
When you get a CFO who is more interested in cost cutting than innovation, experiments like IoT that have yet to see profitability get shut down.
The next round of layoffs is going to be all the IoT groups.
Re:The end of the IoT road at Intel? (Score:5, Interesting)
I don't think there was ever any serious commitment to the Galileo platform at Intel.
I was contacted by Intel in Dec. 2014 and asked if I wanted some free Galileo boards + Grove sensor kits to evaluate for academic use. It took them six months to ship the boards to me. Three times I emailed them, and each time a different person responded, because the previous contact had transferred to another group. After many apologies, I finally got the boards in June, but Intel had missed the window of opportunity for us to incorporate them into the 2015-16 labs; nor was there anything compelling enough in their specs to make any faculty want to try them out in place of Arduinos or BeagleBoards.
Last August, I gave one of the Intel kits to my teaching assistant to evaluate for use in our electronics lab. His report to me was that the Galileo boards were unsuitable, as their slow I/O made them unusable for the D/A conversion experiments that we needed them for. My TA then checked and found out that Intel had dropped their academic program entirely, so he built a board using a standard Atmel processor instead.
Given the huge amount of churn in Intel personnel working on Galileo, it was painfully obvious that their academic IoT push was doomed from the get-go. Intel still wants to sell $400 processors, not $2 IoT chips, and that is clearly where the internal prestige and employee rewards are being directed within the company.
Re: (Score:3)
they tried the Curie chip but it was a flop. arduino101 has no sales, no projects and the intel 'stack' is very nonstandard and has no traction with devels.
their expensive boards were a yawn. good technically but WAY overpriced and, given intel's history, not trustable to be around for very long.
I DEMAND AN 'UPDATE STORY' and also SHELF SPARES to be kept around at the vendor side for years. if not, then I have no faith in your 'platform'.
intel needs to be broken up, like the old phone company. companies
Re: (Score:2)
fwiw, the 'blue pill' is the next big thing and intel lost out, entirely. you can pay well over $100 or you can pay $2. I know what I would do ;)
I bought 20 of the bluepill boards a few months ago when they were mentioned on Hackaday. I plan to use them for small USB HID device projects, and I already have one working with mbed code (the CPU is equivalent to one on an ST Nucleo board) and an ST-Link v2. I've been working with STM32 since 2010, so it's like bread and butter to me, especially the F103.
Probably because they're crap (the Edison) (Score:5, Informative)
I mean, on paper the specs are great, but I've actually done projects with these things and they're seriously junk. They burn out if you look at them wrong. Additionally, they have a 1.8V GPIO level, so there's basically zero chance that you can use any other peripheral without level shifting.
I've talked to a lot of other folks about them as well, they have a terrible reputation in the maker community.
And they're expensive.
So yeah, I'm not surprised. I abandoned them after a single project, like most other folks I know.
Re:Please explain "level shifting" (Score:5, Informative)
Arduinos typically represent logical bits using 5 volts. When manufacturers make devices that work with Arduinos (such as sensors), they develop those sensors to communicate using 5 volts as well. Raspberry Pis actually use 3.3V to represent bits, so you'll often see manufacturers develop both 5V and 3.3V versions of devices. Level shifters are the equivalent of adapters - they sit between two devices that use different voltage levels to exchange data and "shift" signals to the correct voltage.
So, if Intel's boards use 1.8V, this makes it harder to use existing sensors and other devices made for Pis and Arduinos.
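In its simplest one-directional form (a 5V output driving a lower-voltage input), the level shifting described above is just a two-resistor voltage divider. A quick sanity check of the math as a sketch (the 1 kΩ/2 kΩ values are illustrative only, and a plain divider is too slow-edged for fast buses):

```python
def divider_out(vin, r_top, r_bottom):
    """Voltage at the tap of a divider: vin -> r_top -> tap -> r_bottom -> GND."""
    return vin * r_bottom / (r_top + r_bottom)

# Knock a 5 V logic high down toward 3.3 V with a 1 kOhm / 2 kOhm divider.
print(round(divider_out(5.0, 1000, 2000), 2))  # → 3.33
```

Note that a divider only steps down; driving a 5V input from a 1.8V output needs an active shifter (a transistor stage or a dedicated level-shifter IC), and bidirectional buses like I2C need purpose-built bidirectional shifters.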
Re: (Score:1)
AKA basic electronics that any regular visitor to this board should know. In fact, it's one of the most basic pieces of electrical knowledge out there. Anyone asking that question on this board likely does not belong here.
Re: (Score:2)
I noticed several times in discussions on social media that people who designed boards and systems didn't understand basic EE and how transistors work. Maybe they should leave that to people who know what they're doing? I shudder thinking about how they'll do with harder subjects like transmission lines.
Re: (Score:2)
Great attitude, Mr. Khyspergers'. [/sarcasm tag for the *obviously* impaired]
Re: (Score:2)
One thing, though: if cost & power are important to IoT, like they usually are to embedded systems, wouldn't they have all moved to at least 3.3V by now? I used to be in the Flash memory business up to 10 years ago, and while we'd initially sell 5V flash, the market moved completely to 3V - in everything from PCs to optical drives and the like. I'm surprised to read that the Arduino, or any other embedded system of that class, would still be at 5V when the rest of the stuff is at 3.3V.
By being at 1.8V, Intel
Re:Please explain "level shifting" (Score:4, Insightful)
It means that the GPIO is not
A GPIO is a general-purpose input/output
An input/output pin is
I give up. Go read stuff on MSNBC, reddit or somewhere else.
Re: (Score:2)
GPIO means general-purpose input/output.
A GPIO pin is simply a pin you can connect to and do I/O with.
Typically, you connect these to other components to do whatever stuff you need to do. But they need to agree on voltage levels.
I don't do any of this shit, but if Intel went against the grain and requires a voltage that no one else uses, it would be a moderate pain in the ass to connect to their GPIO pins.
Re: (Score:2)
Underlying point being that the voltage of the I/Os can be different from the Vcc levels, with internal chip-level shifters. This is done if most of the off-the-shelf components are still at a different voltage than the chip in question.
Re: (Score:2)
"they have a terrible reputation in the maker community"
Well, duh. Anyone with a basic idea of electronics knows this is too much shit for a simple task.
Re: (Score:2)
This assumes they were meaningful and actual competition.
The documentation was bad, the prices were uncompetitive, and this led to ~0 market share.
Re: (Score:2)
SparkFun sold these products. Do you think you were going to get a critical review of this product from one of their larger resellers? Of course SparkFun is going to present them as something exceptionally good. The purpose of those videos is to advertise the products in them.
Replacements? (Score:3)
Isn't it also possible that they will be announcing and releasing a new product before December 31st?
Re: Replacements? (Score:1)
Would you trust a new Intel IoT thing for making a product if they announced end-of-life of this one within two years of launch? That is very fast, even compared to something as volatile as smartphone CPUs.
Re: (Score:2)
If the old thing is x86 and the new thing is x86, the internals won't really matter.
Re: (Score:2)
Possible, but Microsoft has discontinued tons of products to much gnashing of teeth, only to release the replacement a month later.
Re: (Score:2)
Isn't it also possible that they will be announcing and releasing a new product before December 31st?
Oh, I hope it's a CPU. It's very likely that while they were chasing other businesses and resting on their laurels, and while AMD stole their lunch, they realised they have done crap-all in the CPU market.
Re: (Score:1)
Probably not before the end of the year, but this seems to be part of their routine. Back in the 80s, Intel had the general-purpose 8051 micro-controller (and the 8048 that was in the IBM PC keyboard). They killed it off to focus on x86 products. Then in the late 1990s or early 2000s, they released an ARM micro-controller (XScale). That lasted a few years and they killed it off to focus on their x86 stuff. At least this time, they tried to make micro-controllers out of the x86 architecture.
Re:Quietly? (Score:4, Informative)
Standard procedure for bad news is to post it to your press/corporate site on a Friday, but not actively tell anyone.
Standard procedure for good news (or new product news), is to hint, tease, and preannounce, then reveal early in the week with announcements on the press/corporate site, emails to journalists, branding and news "articles" on the main site, etc. Throw in some reviewers / tech "journalists" who are suckling at your teat and willing to sign NDAs and you'll have tons of coverage ready to go when you want it.
Posting AC Obviously. (Score:4, Interesting)
I may or may not work for the vendor of these products.
I may or may not have had a hand in designing the chips.
I purchased a Galileo to mess with. After all, I know the chips quite well.
It was utterly unusable. I couldn't even light the LED. The documentation was a walkthrough of how to light the LED, but it didn't work. Involved in this was a whole software layer to make the native hardware interfaces look like some other board at the API level, which was obviously daft if you are trying to get people to know and understand the chip so they'd choose to design it into products. I failed to crack through this layer of obfuscation before I gave up and did something more productive.
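For context, "light the LED" on Linux-running boards of this era usually meant the classic sysfs GPIO dance, which the Galileo then wrapped in its Arduino-compatibility layer. A minimal sketch of the plain sysfs style (the pin number is a placeholder, and the GPIO root is a parameter so the file-writing logic can be exercised without hardware):

```python
import os
import time

def blink(pin, cycles=3, delay=0.5, gpio_root="/sys/class/gpio"):
    """Toggle a GPIO pin via the classic Linux sysfs interface."""
    pin_dir = os.path.join(gpio_root, "gpio%d" % pin)
    if not os.path.isdir(pin_dir):
        # Ask the kernel to expose the pin (it creates gpio<N>/ for us).
        with open(os.path.join(gpio_root, "export"), "w") as f:
            f.write(str(pin))
    with open(os.path.join(pin_dir, "direction"), "w") as f:
        f.write("out")
    value_path = os.path.join(pin_dir, "value")
    for _ in range(cycles):
        for level in ("1", "0"):  # high, then low
            with open(value_path, "w") as f:
                f.write(level)
            time.sleep(delay)
```

On the real Galileo, extra pin-mux writes were needed on top of this (part of the compatibility layer the parent describes), and newer kernels deprecate sysfs GPIO in favor of the gpiod character-device interface.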
Re: (Score:2, Funny)
I may or may not work for the vendor of these products.
I may or may not have had a hand in designing the chips.
No wonder they turned out to be complete turkeys if Intel's employees can't even remember who they work for, or what they do there.
Re: (Score:3)
Thanks for trying. Edison was an amazing little chunk of hardware for certain purposes (mine was low-power systems that interfaced to things with proprietary x86 drivers), but it always felt like it was one hardware guy's pet project that nobody in the software department gave half a rotten rat's ass about.
The crap they had instead of tech support was a legendary middle finger to the customers. A bunch of clueless, barely-English-literate drones who did nothing but reply to your post about something wron
Why did they even bother? (Score:4, Insightful)
Arduino, BBB, and RPi had already been out for years before Intel finally figured out that there was a market there.
Then, when they finally got off their butts they came to the party with a stupidly overpriced offering that didn't fit with the existing ecosystem.
Why did they even try doing their own thing at all instead of helping to improve what already existed? For example, why not work with ODROID to put Intel chips on their boards instead of ARM?
This whole thing was stupid and ham-fisted on Intel's part-- whoever the exec was that made the decision should get a stern talking-to.
This also matches up with Intel's flailing in response to AMD's recent surge (sad as it was that AMD was on the ropes for so long).
Re: (Score:2)
What else were they going to do?
I mean it's not like they have any competition in the CPU market so why bother working on a CPU. Find another way to make money. It's like Microsoft. They don't have any competition in the OS market, so why bother working on an OS.
This is standard for a huge company with a monopoly. Rest on laurels until someone comes along and pulls the rug from under them.
Re: (Score:2)
What I'm really saying is "If they were going to half-ass it like they did then why did they even bother?".
Re: (Score:2)
Actually, they do have serious competition in the CPU market. THEMSELVES. They can't push their shit b'cos their previous shit was so good that nobody needs to replace it. Hence, the need to hunt for new markets.
But another good business plan for Intel might be to become a TSMC or Samsung, and start fabbing chips for Qualcomm and others.
Not surprised (Score:2)
I would hope that the whole reason they are discontinuing these products is the realization that they don't even compete with the ARM products out there. Hell, the ESP8266 showed that people will even tolerate an essentially unknown CPU instruction set, locked-in firmware, and a horrific manufacturer SDK. None of it matters if you just sell it cheap enough. So why, if Intel wanted to compete, slap an Atom and bare-bones chips onto an IoT board at a price that guarantees no one will use it?
They
Re: (Score:2)
"Hell, even just optimize the original Pentium core"
Quark is in fact a P54C.
The Joule is an interesting kill. (Score:2)
The others were hopeless: too cut down (in terms of 'IBM PC' stuff) for x86 compatibility to be of much use, and notably lousy at GPIO twiddling compared to microcontrollers or devices like the TI ARM part in the Beaglebone; but at least they were expensive!
The Joule, though, was a stock Atom part, plus some RAM, flash, and a NIC on a little computer-on-module. Not based on some weirdo part; and it allowed you to drop a more or less standard Atom-based s
sku (Score:2)
SKUs
Any alternatives? (Score:2)
I'm not heavily invested in this platform, except that over the past couple of years I've had a fetish for the SparkFun blocks, and "any day now" was going to open up some time to use them in a custom board for a side gig or, worst case, resume fodder. So I've got 5-6 of them lying around, the big breakout and the little breakout, and, as I said, a bunch of red prototyping boards.
At this point I don't know how I'm going to justify experimenting with all the kit I've accumulated. For all the flaws in the product-
Re: (Score:2)
Well, I don't really give a rat's ass about x86, and I especially don't care about compatibility with PC software or hardware. The closest alternative is the Pi (which is ARM), and perhaps the Yun, if it came in a smaller form factor (which is MIPS). And for my purposes the GPU on the Pi just adds cost and power consumption.
The power of the Edison is in its form factor, low power consumption, built-in battery support, USB, and wireless. That plus a good mix of compute cores---two medium-strength cores for Linux