Hardware

Predicting the Future of Electronics and IT by Watching Component Demand (Video)

Video no longer available.
A big question college students should be asking is, "What IT and electronics knowledge will be most in demand five or six years from now?" In these fast moving niches, an answer is almost impossible to come by. But what if you were one of the people who supplied raw components to the electronics industry? Wouldn't you have a better handle than most on what kind of devices and components are becoming more popular among prototypers and engineers? And wouldn't watching those trends possibly give you at least a little insight into what the future might hold? Randy Restle, Director of Applications Engineering at component supplier Digi-Key Corporation, carefully tracks orders and tries to determine what's hot and what's not. His reason for doing so is to figure out what Digi-Key should stock in coming months and years. But his insights can also be used to decide what you might want to study or -- if you're already working in the field -- what products you or your company should consider developing. Digi-Key also has an online video library where they feature new products and give ideas of what you can do with them. Even if you're not an engineer or electronics hobbyist, it's fun to see what's available but may not have hit the mass market quite yet.

Robin: So what should I be looking at as a trend, the biggest trend right now?

Randy: I knew you were going to ask me this, so I have come up with a couple of suggestions, and they range from things happening in software that are affecting hardware, to wireless, to device technology. So let me start with the technology.

Robin: Okay.

Randy: In technology there is a device technology, gallium nitride, that is used an awful lot in high-brightness LEDs, but what a lot of people don't know is that this technology makes incredible MOSFET-like devices. And so we've got a supplier that just today is announcing a new device. These conduct current at the surface of the device, so they are incredibly small, high-powered devices that operate at over 100 kilohertz, very high frequency operation, with tremendous current-carrying capabilities. Those are enhancement-mode gallium nitride devices that are exciting to me, anyway.

Robin: And what do they change for the rest of the world?

Randy: What they do is make power supply packages more efficient, smaller, faster. So if I had a motor controller, I could run its switching at a higher rate and get tighter and stiffer motor shaft control, or a smaller package. I don't know if you have heard of Another Geek Moment; these are videos that we have online, and we've got a fellow who is demonstrating the power of these devices. He actually made one of the old carbon arc lamps that were used in the Civil War days, where you can see that just those little devices is ____2:36, and after finishing that little demonstration, he took that thing's circuit and did some welding with it. So this is just a small little device that's smaller than my little finger.

Robin: Wow! Okay, I think we will just splice that video in here, won’t we?

Randy: The current conduction is at the surface of the device, so it doesn't heat up the device and it can carry much more current. That's key.

Robin: Wow! Okay, and what about in the computer and IT side, what’s big there?

Randy: So, IT. What customers are increasingly doing is buying and basing designs, at least initially, on a single board computer. So it is not exactly IT, but it is very much influenced by things going on in IT. These run standard operating systems and such. You know, the world's leading semiconductor suppliers believe the world is becoming the internet of things. So 32-bit processors are becoming prevalent. These things have a complexity to them, and a need for software.

You know, my application might still only require an 8-bit solution, but connecting to the internet requires a level of complexity that forces these people into a higher-end device. So rather than be encumbered by getting a higher-end, IT-like operating system to run on these small devices (these 32-bit devices that today cost less than $1 in some cases), a lot of people are going to single board computers.

I happen to have here a BeagleBone Black. This thing costs $45 at quantity one, and it has got a gigahertz processor on there. You can go ahead and put Linux on this and get your TCP/IP stack, you can get your UDP, your TCP, you can get email, and you can get a graphics server; you get all that functionality on about a $45 platform. But that's not all. I mean, there are just tons of these things in different form factors: Freescale, Atmel, TI, there are just tons of these single board computers that people are using a lot. That's about as close to the IT trend as I can speak to.
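None of that networking has to be written from scratch: on any of these Linux single board computers, the kernel's TCP/IP stack is exposed through ordinary sockets. Here is a minimal sketch using only the Python standard library; nothing in it is BeagleBone-specific, and the message and port choice are made up for illustration.

```python
import socket
import threading

def run_echo_server(host="127.0.0.1", port=0):
    """Start a tiny one-shot TCP echo server; returns (thread, bound_port)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port 0: let the OS pick a free port
    srv.listen(1)
    bound_port = srv.getsockname()[1]

    def serve_once():
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)  # read one short message...
            conn.sendall(data)      # ...and echo it straight back
        srv.close()

    t = threading.Thread(target=serve_once, daemon=True)
    t.start()
    return t, bound_port

def echo_client(port, message):
    """Send a message to the echo server and return the reply."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(message)
        return c.recv(1024)

if __name__ == "__main__":
    t, port = run_echo_server()
    print(echo_client(port, b"hello, BeagleBone"))
    t.join()
```

The same code runs unchanged on a desktop or on a $45 board, which is the point Randy is making: the operating system, not the hardware, supplies the protocol stack.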

Robin: I have a Raspberry Pi over here.

Randy: You've got a Raspberry Pi, that's another one. And there are so many of these boards that Digi-Key's applications engineering team has a website, it is called eewiki.net, and on that website we've got a matrix of all of the single board computers that we are aware of, dimensioned by different parameters, like clock speed, memory, and what OS it supports. So that might be something that viewers and listeners will find value in.

Robin: Now somebody else at your company said that the two big buzzwords coming up now are Linux, obviously, that we are through with our Windows phase and we are now back to Linux, which I am running at the moment, by the way, on one of the two computers in front of me, and Android being the other one. Oh look, I have an Android!

Randy: There we go, perfect. Let me mention Android. It turns out Android is lately becoming more popular than plain Linux, and of course Android is a version of Linux.

Robin: Yeah.

Randy: But the reason for Android is that devices are increasingly getting these graphics displays, so rather than work out all the lower-level Linux details on graphics, like GNOME or KDE and all that kind of stuff, people are going to Android because it is so well supported on graphics. There is a simple value proposition to go to Android if you've got a graphics device. So we see an awful lot of customers going that route.

Robin: That's an interesting trend. We've seen it, by the way; there are Android notebooks out there. I bought one for a granddaughter and it is very good for a kid to do homework on. It is pretty much a tablet with an integrated keyboard.

Randy: I agree. And I think there are Chromebooks too that are just incredibly high value. It is a good time to be in the market for electronics as long as you don’t really care about that underlying operating system and Linux is a good one.

Robin: I am going to ask you this, I have to, I’ve been asking people this for 15 years. What is your favorite Linux distribution?

Randy: Well, I've tried a ton of them, and the one that I stick with more often than anything else is Debian. And I tend to go with the version that's stable and then upgrade it so it becomes kind of like my own. I wish Debian would roll in new versions more often. But the other thing that is just so compelling as an engineer, and I am an engineer, so at home I'm fiddling around with this stuff, is Gentoo. Gentoo is a platform that an engineer loves, but boy, it takes forever; time-efficiency-wise, it has to be a hobby to get into Gentoo, is my feeling on it.

Robin: I am just a point it and click. In fact, I wrote a book once called Point and Click Linux. So I use Linux Mint which of course is you know, it is Debian with a mouse.

Randy: Yeah. And I was on the rolling edition of Mint, and it was fine, except that my graphics would go down every fourth update, so it just got to be too frustrating, and I have now gone to Debian.

Robin: I can see that. Well, that’s excellent. What else should we be looking out for, coming at us based on your experience?

Randy: So let’s see. We talked about the technology of enhanced gallium nitride which excites me. We talked about single board computers. We talked about Linux and Android. So that’s three of five trends that I’ve identified. Another one is ARM processors. And I’m looking over at another screen of mine and I need to find a slide that will give me the numbers.

What it amounts to is that Digi-Key stocks just a tremendous number of microprocessors. And if you count up all of the customers who use a microprocessor, it is almost 100,000 users. So 100,000 different people are buying micros. Of them, three quarters are buying an 8-bit device. Well, if you look at how many 8-bit devices there are, and you look at their cores, there are about 140 cores of 8-bit devices, which amazes me.

Now if you look at the 32-bit devices that they are migrating into, there are 90 cores. So there are tremendous numbers of cores, but increasingly everybody seems to be going to ARM, and there are four fundamental ARM cores, so I see a tremendous simplification in things.

I think it has given the semiconductor suppliers problems, because they are competing when the core is the same. They are competing on what peripherals the part has, their development suite, and all these other things. So again, I think it is a good time to be an electrical engineer. At 1,000 quantity, you can buy a 32-bit device from a couple of suppliers for, I think, like 49 cents. Amazing technology for a cheap price. So ARM is a juggernaut that doesn't seem to be ending.

Robin: On my desk, ARM processors now outnumber traditional processors of any other kind. Is that the way the world's going?

Randy: It seems that way. I will say there is one holdout 32-bit device, which happens to be the most popular 32-bit device. That one happens to be from Microchip: their PIC32. I think they've got a certain compatibility; the peripherals are the same on the PIC32 because those 8- and 16-bit PICs are so popular, and I think that's what is creating so many 32-bit PIC users. But it is either a PIC32 or ARM, it seems, by the numbers.

Robin: Now since most of our readers I think still work in computers, the boxes and the laptops, what devices are we talking about with the PIC32s?

Randy: So the PIC32s are going to end up being in embedded devices that they use. They wouldn't recognize that the part on the inside is a PIC32. And truthfully, Digi-Key doesn't pry into a customer's business. We don't ask them what the product is. So I don't really know the range of products, but with the numbers that we sell, there would have to be tens of thousands of products based on that device, just as there are on ARM.

But I will tell you one thing: you pointed out a bunch of applications, that cellphone and that tablet; yeah, those are definitely ARM. And the other thing I'll say is that ARM has a faster clock speed and more memory capacity, so it wouldn't surprise me to find that every one of the semiconductor suppliers who has a micro presence ends up having an ARM representation too.

Robin: But one thing that you are reminding us of here is that the computer we think of as a computer, with the screen we look at, like we are using now, is really a small minority of computing devices; embedded is the world now.

Randy: You are right. I used to work for Texas Instruments, a long time ago now. At the time, I worked in the digital signal processing group, and we were tallying up all of the opportunities for a digital signal processor or a microprocessor. Incredibly, the average household has over 100 micros in the products they buy. The fact that we are both talking on these computers: it wouldn't surprise me to find that these little video cameras also have a micro in them that is not dedicated logic, that is a programmable device. And so that is hundreds of devices. The fact is that PCs are really the minority computer application today.

Robin: Even in our house, we have a TV and a number of things that hook into it: a Wii, a this or that, several different recorders, a Blu-ray and an older one, and they all have microprocessors.

Randy: Absolutely, you are right.

Robin: Wow! Everything. Our refrigerator is not fancy, so it doesn't.

Randy: Well, you know, do you have an ice maker in that refrigerator?

Robin: No. It is on purpose not fancy.

Randy: Okay. All right. But it's hard to find one that isn't microprocessor-controlled, because of energy efficiency and Energy Star and all of that; that usually takes some smart device, even in devices that you don't think are smart.

Robin: You know, that’s entirely possible. I don’t know.

Randy: There is one other trend that I think connects up to all of those.

Robin: Yeah, yeah.

Randy: It is wireless. The fact that these devices are smart means they are able to connect up to everything else, so it comes back to this 'internet of things'. So Digi-Key is doing everything it can to have as many wireless devices out there as possible. And then the developer has this problem: on the one hand, they've got their application; on the other hand, they've got the internet connectivity they have to have; but then, if you think about it, the next piece is the server side. So there are solutions that customers can use to get their device on the internet without writing the server side, and those companies, I think, are a growing trend. I think we are likely to see more of them helping developers get their products out in the market.

Robin: So to think about it from the perspective of a younger person, who is just moving into computers, IT, getting a computer science degree or whatever, that person really should be looking beyond "computers" and into everything.

Randy: That’s right. You know, when we hire applications engineers at Digi-Key, the electrical engineer is an obvious match. He is the guy who is used to designing circuits and that might range from RF, to motor control, analog, digital, all of that. But we also find great success in hiring computer engineers. These are guys who can understand that next level; there is an awful lot of software involved here too. So we employ both kinds of roles.

The computer science guy alone, that is a little bit narrow, I think. The computer engineer and the electrical engineer, those guys are everywhere. You know, I happened to have a conversation with our IT director some time ago; we were arguing about approaches to solving a problem, and I said, you know, there is not a single thing that IT is using that wasn't developed by an electrical engineer. Every router, every computer, every piece of computing hardware, except for maybe the software itself, was developed by an electrical engineer. So the electrical engineering student has a lot to learn today.

This discussion has been archived. No new comments can be posted.

  • If so:
    No
    No
    No
    and No.
    But thank you for posting a summary that's nearly 50% questions!
  • ARM designs the most popular processor architectures in the world. There is an ARM core in literally billions of machines, and I don't just mean cell phones. Modern ARM chips run anywhere from 12 MHz to 2.2 GHz, and they can scale to run much, much slower to save power big time (there is an ARM chip that rivals the MSP430 chips). Now with the ARMv8 arch, I think we will be seeing some serious inroads made on the server market. Of course, ARM will continue to be in everything from your coffee maker to the ch

  • by Anonymous Coward on Thursday October 17, 2013 @04:58PM (#45158125)

    How the hell am I going to pay off my student loans by entering an industry moving towards decentralized, lowest-bidder IT and commodity hardware, where the labor market is global and comprised of people who have either been in the field for decades or can live on peanuts compared to you, where the brightest minds of a generation are bent on extracting pennies from stock trading algorithms, or coming up with new ways to make you look at ads, or engaged in wholesale invasion of privacy?

    Do I really want to piss away the best years of my life writing code for yet another tech startup with no business plan beyond IPO, making billions for investors while getting nothing in return? To know that, in the end, I made no difference in the world?

    My advice: Make computer science a hobby, not a career.

    • Amen. If I could start over again, same circumstances, I'd be a fucking lawyer in a heartbeat.
      • These days, even the law school grads aren't doing so well.

      • by n7ytd ( 230708 )

        Amen. If I could start over again, same circumstances, I'd be a fucking lawyer in a heartbeat.

        You may be on to something there... a prostitute that also does estate planning? That's gold, Jerry!

    • by necro81 ( 917438 )
      Or, don't focus solely on computer science - i.e., being a code monkey. Instead, understand the underlying hardware and create tightly coupled hardware/software solutions - embedded software.

      For instance, the video discusses GaN FETs, and the new power density they enable. Become a controls engineer and you can end up using these devices to make world-class power supplies (not to be confused with wall warts) that are used in electric vehicles, industrial robotics, and renewable power. There is still
  • by HeckRuler ( 1369601 ) on Thursday October 17, 2013 @05:05PM (#45158195)

    A big question college students should be asking is, "What IT and electronics knowledge will be most in demand five or six years from now?" In these fast moving niches, an answer is almost impossible to come by

    Actually, I believe there are good solid answers to this one that have been true for a decade and will likely be true for the coming decade.

    First off kid, you have to understand that there are a lot of fields you seem to be lumping together. There's a difference between code-monkeys, sysadmins, network engineers, electrical engineers, embedded engineers, and web-devs.

    For any programmer there's a big question of which programming language to learn. This is something that induces flame-wars and strong passions because everyone has an opinion and their own choice is best. This is because it's an inverse tragedy of the commons, everyone wants you to learn their language because it benefits them and their language to have more users. But a binary search tree is a binary search tree in any language. Some are more verbose. Some are cludgy. But if you understand binary search trees, or whatever, the language used to deal with them by and large doesn't matter. Knowing the syntax of a language doesn't make you a good programmer. Knowing how to use the language to accomplish meaningful tasks, that's what's important. It's a little easier if you learned C rather than IBM RPG back in the day, but if you could learn RPG, you can pick up C without serious problems.
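To make that point concrete, here is the binary search tree idea sketched in Python; translate the comparisons and child-pointer updates into C, Java, or even RPG and nothing essential changes.

```python
class Node:
    """One node of a binary search tree: left subtree < key < right subtree."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Insert key and return the (possibly new) root; duplicates are ignored."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def contains(root, key):
    """Walk left or right by comparison until found, or fall off the tree."""
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

root = None
for k in [5, 2, 8, 1, 9]:
    root = insert(root, k)
```

Only the syntax is Python's; the invariant and the walk are the data structure, and they survive any change of language.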

    For Web-devs, they'll fret over... let's say... which CMS project is better: Joomla, Sharepoint, Drupal, Django, Wordpress, yaddayaddayadda. Conformity is nice and picking one is important. But you're a COLLEGE KID, when you graduate you'll know what goes into a CMS, theoretically how to make one, and how they work. If you just wanted to learn how to turn it on, you should have gone to a tech school. They'll hold your hand and read the manual with you.

    (By the way I also have a thing against "certification". It might make sense for the sysadmin types, but a cert on a programmers resume is a net negative.)

    Sysadmins, network engineers, and the hardware guys all probably have similar stories. There are common tools out there you should know, but God knows everyone and their brother makes a version of them. Try not to tie yourself to one particular set of tools, lest you suffer from over-specialization.

    tl;dr: It doesn't matter what specific component, language, framework, or gadget is popular in 6 years. You're in college, not a tech school. Learn the basic fundamentals of your field and whatever the hip new thing is will fall nicely into place and you'll understand what it's doing and what's going on. You need to learn how to use a hammer and nails to build things, not fret over which hammer is the best bet.

    • by fermion ( 181285 )
      Or, you shouldn't be going to college for a specific job. While many jobs require a degree or even a very specific degree, paying 100K to a university so one can be a code monkey may not be the best thing to do. The reason to go to college, to get a degree, other than the fiction that a piece of paper will inherently get you a better job, is to become well rounded and, well, educated. I know too many people who put those four years into building their technical skills and achieving what they really wante
    • a binary search tree is a binary search tree in any language

      I hope employers still see it that way. In years gone by, I interviewed people for software positions and often didn't bother to ask what languages they knew (or if I did, I didn't consider it important). My attitude was that if you can't quickly become proficient in a new language, we made a mistake in hiring you. These days though I see a lot of job ads for "Language Du Jour Programmer". Say all you want about code monkeys, but if employers want experience w/ a specific language, what can you do?

      Some are cludgy.

      Unfortuna

  • by frovingslosh ( 582462 ) on Thursday October 17, 2013 @05:13PM (#45158269)
    Over three decades ago I worked for a minicomputer manufacturer (sometimes known as Data Who?), in Field Service support and later in Systems Engineering. Those of us in the field were able to put together a very good idea of what new products were going to be released, not from listening to the rumor mill, but by looking at the parts lists that were being published internally and seeing what components were being bought and assigned internal part numbers. It's amazing what you can learn about supposedly secret projects just by seeing what the company is buying.
    • The KGB used to do similar things. If you want an idea what new project XYZ Aerospace is working on, just check the job ads.

  • I was particularly struck by the cost per unit he cited for 32-bit processors: $0.49/processor. At that cost profile the possibilities for DIY swarm and fabrication projects are compelling; a vision of autonomous mesh nodes spreading throughout our cities, powered by ambient backscatter chips, and forming the ultimate redundant network danced through my head.

    Exciting times.

    • by Dzimas ( 547818 )
      In all fairness, you get a pretty basic processor for 50 cents: an ARM Cortex-M0 core running at 30 MHz with 4KB of flash memory and 1KB of SRAM in an 8-pin package (52 cents, quantity 5000). I am constantly amazed when pricing out designs and discovering that the quite capable little MCU that I budgeted $3 for now costs a mere $1.20.
      • by unitron ( 5733 )

        ...I am constantly amazed when pricing out designs and discovering that the quite capable little MCU that I budgeted $3 for now costs a mere $1.20.

        And if you weren't such a blabbermouth about it you could have pocketed the difference and no one would ever have been the wiser.

        : - )

    • Digikey is amazing. We use them all the time for protos and some production. Where tracking Digikey fails is the role of truly innovative stuff barely out of the lab in small production, as well as, at the other end, custom ICs. Sure, they do FPGAs, but try buying a full Intel chipset, a GPU, or whatever Qualcomm is selling to phone makers. They really are more trailing than leading edge. And the county airport is being expanded to handle larger freight jets just for Digikey. They probably have several million SKUs and
  • Resistor sales are up. I don't know what that means for electronics and IT, but I predict the future will be warmer.
    • It means some people finally figured out Ohm's Law. If they get the hang of complex numbers, capacitor and inductor sales will go up too.

    • by mysidia ( 191772 )

      Resistor sales are up. I don't know what that means for electronics and IT, but I predict the future will be warmer.

      The borg will not be pleased. Resistance is futile

  • Analog electronics (Score:3, Insightful)

    by rlh100 ( 695725 ) on Thursday October 17, 2013 @07:49PM (#45159739) Homepage

    If I was a young double E student I would focus on analog electronics. Designing analog electronics is a dieing art. And it is art as much as electronics. Simulation only goes so far. Then you need to know the tricks of design and layout.

    The old school analog electronics engineers are retiring and there is not a new crop of young engineers to take their place. While more and more things are going digital we will always need analog electronics to interface with the real world.

    Analog electronics will become a specialized niche that will command big bucks. Kind of like COBOL programming. Neither of which are very glamorous but both of which are all around us.

    • by unitron ( 5733 )

      ... Designing analog electronics is a dieing art...

      As is spelling, it would seem.

      : - )

      But seriously, if you've got the kind of brain suited for it, not just analog, but the voodoo known as RF in particular should keep you in demand.

  • ...hope they (and Mouser) keep a good stock of Low ESR capacitors for some time to come, due to "capacitor plague".
