
Arduino Goes ARM

mikejuk writes "The whole world seems to be going in ARM's direction. The latest version of Windows, Windows 8, will run on ARM processors, the Raspberry Pi is a $25 ARM-based machine, and now the open source Arduino platform has a new member — the ARM-based Arduino Due, announced at the Maker Faire in New York. The Due makes use of Atmel's SAM3U ARM-based processor, which supports 32-bit instructions and runs at 96 MHz. The Due will have 256KB of Flash, 50KB of SRAM, five SPI buses, two I2C interfaces, five serial ports, 16 12-bit analog inputs and more. This is much more powerful than the current Uno or Mega. However, it's not all gain — the 3.3V operating voltage and the different I/O ports are going to create some compatibility problems. Perhaps Intel should start to worry about the lower end of the processor world."
  • I dig this idear!

    FTA (bold = my emphasis):

    We’re going to be demoing the board and giving away some boards to a selected group of developers who will be invited to shape the platform while it’s being created. After Maker Faire, we will begin selling a small batch of Developer Edition boards on the Arduino store (store.arduino.cc) for members of the community who want to join the development effort. We plan a final and tested release by the end of 2011.

    I can't wait to experiment with these puppies.

    • Comment removed based on user account deletion
      • by Zerth ( 26112 )

        Um, this ARM chip is replacing an AVR microcontroller. No x86 here.

        • Comment removed based on user account deletion
          • ...they do some cool rocketry stuff which that little board fits nicely into...

            Could I twist your arm into sharing? Pretty please? I've been looking at Arduino and Raspberry Pi for a few projects along those lines, and I'm really, really curious what others are doing (and how, of course).

            • Comment removed based on user account deletion
              • Cool...thanks for the info, and good luck with your project -- and I hope the new, green kid comes up to speed quickly ;)
                • Comment removed based on user account deletion
                  • Yeah, I had wondered about using an Arduino to deploy the parachute. I've been watching YouTube videos of a guy who builds KNO3/sugar rockets (his user name is solidskateboards, I think...something like that, anyway) and he was trying to use springs, steel balls and relays to trigger the parachute release. He had all kinds of problems working out the bugs (I think it was a bounce problem), and destroyed a number of rockets before he finally got it to work correctly. Fortunately, none of them were carryin
      • If you're running Windows and one of your processes explodes, as Firefox seems to do almost daily, and starts trying to burn up the entire CPU, it used to be a real problem, because the user interface became unresponsive. With dual-core, what usually happens is that one core gets burned up by Firefox, but the other one's available for other processes, like driving your UI or running mail or killing Firefox.

        • If you're running Windows and one of your processes explodes, as Firefox seems to do almost daily, and starts trying to burn up the entire CPU, it used to be a real problem, because the user interface became unresponsive.

          We run several single-core "home theater" machines for web browsing & videos, all running XPSP3 and Firefox, and I can't remember when we last had a hang like that.

          Perhaps last winter or spring, and almost certainly caused by Flash, not Firefox.

          Anyway, Ctrl-Alt-Del and force quit is a quick&easy solution. No need for multiple cores.

          • It's probably Javascript - I'd normally blame Flash, but that's generally wrapped up in plugin-container.exe these days, and the CPU usage shows up under Firefox itself.

          • by k31 ( 98145 )

            you are both right.

            Flash (which runs within Firefox) can cause this behaviour, and it can make Celerons and such unresponsive. Single cores with hyperthreading and AMD-based machines seem less affected. Must be a Flash/Windows quirk.

            Even ctrl-alt-del may fail to cause a response in a timely manner, if a process "explodes".

            However, multiple cores are really more useful for transcoding and other "geeky" stuff. As more people rip their DVDs to save them from the scratches of their kids or the clumsiness of their friends, or just common wear and tear, those cores will keep being more important.

            • However, multiple cores are really more useful for transcoding and other "geeky" stuff. As more people rip their DVDs to save them from the scratches of their kids or the clumsiness of their friends, or just common wear and tear, those cores will keep being more important.

              Of course, there are business reasons for using them, too....

              Agree with the first sentence. Video transcoding to h.264 is the most processor-intensive task that any consumer is likely to throw at their CPU today.

              As for ripping DVDs, a Pentium III can do this, no sweat. It's an easy job for pretty much every consumer PC made in the last decade. Transcoding to "modern" codecs is the hard part.

              Business reasons? Not for the vast majority of business users. Unless they are forced to use a Microsoft OS (e.g. Vista, Win 7) that is ~designed~ to overburden the hardware with no

              • by k31 ( 98145 )

                Yes, a lot of business users are forced to use a Microsoft OS...

                That's what I meant by business reasons: their apps only run on Windows, or are bespoke, inefficiently written apps that require multiple cores just to run acceptably; due to office politics they insist on using them; and so on with the other things that happen in business which don't enable progress but occur in the "real" world anyway.

                Also, sorry, I more meant to say Blu-ray ripping and transcoding to h.264/mp4 and stuff, but then, you know, I got lazy.

      • X86 has a place, ARM has a place. When you need to do some heavy number crunching, or want huge detailed 3D worlds, or need to deal with a crapload of data? Then x86 is your guy.

        If you have no idea what you need the processor to crunch, then yeah, a high-power general-purpose CPU is "your guy".

        But if you think about it beforehand and realize that your device needs to do 3D rendering, or H264 video decompression, or any of a number of common high-horsepower tasks, you can augment the CPU with custom silicon that beats the stuffing out of a general CPU, with less power.

        That is exactly what the current crop of handheld devices do. They have special-purpose silicon to do the heavy lift

      • You're a tad off... the Cortex-A8 has about the same performance as the Pentium 3 line. Considering the number of people who buy tablets, I would say that is just about enough computing power for what many people actually need. With the Cortex-A9 MP and A15 MP coming out, the performance of an ARM chip will be at parity, or damn close to it, with x86 on just about everything BUT gaming. In fact, look at the OMAP4 (Texas Instruments) version of the A9, which is used widely today, and their future OMAP5, which will have the abi

      • Oh, for fuck sake, hairyfeet! Why don't you claim that your glorious masters will make Windows for Arduino, and that therefore everyone should cease all development lest they be wiped from the face of the Earth by Windows 8 Arduino Edition? Your task is to discourage all development on non-Windows platforms; work on it, dammit!

      • by PCM2 ( 4486 )

        X86 has a place, ARM has a place. When you need to do some heavy number crunching, or want huge detailed 3D worlds, or need to deal with a crapload of data? Then x86 is your guy. Need to be ultra mobile, where every microwatt counts? Need something light and rugged where it doesn't need fans, or a mil spec toughness for harsh conditions? Then ARM is your baby.

        You seem to be forgetting Intel's Atom line, which is x86 and is the part series that's going after ARM (albeit even the Atom chips are only slowly catching up where power consumption is concerned). Intel partners are claiming you can expect to see Atom-based phones "real soon now."

      • As a retailer I can tell you why sales of x86 have gone down: it's because for most folks what they have is more than good enough for the tasks they are doing.

        Hell, for 99% of those people 64 bit is overkill. Most only need the power of a Pentium, maybe a Pentium II.

        Of course to run their OS you might need more horsepower...

    • I can't wait to experiment with these puppies.

      Marginally OT, I suppose, but every time one of these stories comes up, I get tickled and start looking into getting started playing around with these. Then I see all the different types and realize that I don't know WTF the difference is or even how to get started and give it up. So this time, I'll ask. :)

      Any advice for a CS type to get started messing with the things?

      • by ah.clem ( 147626 )

        You can pick them up pretty cheaply; I got mine on Amazon for about $25. There is a "Getting Started With Arduino" book in PDF format linked from the Arduino website. There is also a pretty good book on Amazon which is an Arduino projects book (don't recall the name right off) but it's not bad. You'll probably want to pick up a small breadboard at RS or on-line; there are a few companies that have put together "kits" of parts to sort of go along with the book, but you can get the parts much cheaper at yo

        • Thanks, I'll grab the PDF onto my NC, so I have something to read. Does it go into the difference between the boards? That's what keeps catching me up. Do I want an Uno, Duemilanove, Nano, or Mega? The "product description" fields for both appear to be identical, and I think, but am not sure, that the Mega is more than I need to start. Do I need to worry about "shields" at first, do you think?

    • by makomk ( 752139 )

      It's actually even better than the summary suggests: the SAM3U range of microcontrollers also supports high-speed USB 2.0 (they're pretty much the only cheap ARM microcontrollers that actually do), which means not only do you get a whole bunch of I/Os for peripherals and lots more processing power, you can also communicate with the host PC at several hundred Mbit/s.

    • Me three, FTW !
  • Can someone explain the difference here? I mean, will this allow bigger / better projects? I'm not a programmer and only vaguely familiar with some of the projects done with Arduino. I love the idea of an upgrade as much as the next person, but I was wondering if someone could give me some context on what can be achieved here that couldn't have been before?
    • They will have more:
      processing power
      ram
      program space
      I/Os
    • by LoRdTAW ( 99712 ) on Monday September 19, 2011 @02:38PM (#37447190)

      This isn't a replacement for the 8-bit ATmega-based Arduinos but a step up for those looking for more processing power. The Arduino toolkit and IDE are very user friendly, but the ATmega has its limits: very small ROM and RAM is one problem, the other being low clock speeds and its 8-bit architecture. One could make a pretty powerful robotics controller or even a game console out of these boards.

      An ARM port of the Arduino toolkit had long been talked about and wanted by many. There are attempts like the Maple, but this is an official ARM Arduino board with official support in the Arduino toolchain.

      All I know is I am glad that this has finally arrived. For a while I was using the mbed for playing with ARM project stuff. A great little development board, but it lacked an open, offline compiler. They also left out a lot of the I/O due to the DIP nature of the board. The one thing it looks like it has over the ARM Arduino is Ethernet, which the SAM3U appears to lack.

    • Some things that come to mind....

      ADC samples with more bits (12 bits vs. the 10 bits on current AVR boards). This would be important to someone who cared about getting better dynamic range from an analog signal; it is roughly a 6 dB improvement per bit (effectively a bit less when considering non-ideal things such as clock jitter). Possible applications would be software-defined radios.

      Clock speed is faster than current Arduinos. If you were running something that was computationally intense and had a small window to complete this
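
      To put rough numbers on the dynamic-range point above, here's the standard ideal-ADC rule of thumb worked out (just a sketch; real converters lose a few dB to jitter and other non-idealities, which is the effective-number-of-bits caveat):

      #include <cstdio>

      // Ideal ADC signal-to-noise ratio (quantization noise only): 6.02*N + 1.76 dB.
      static double ideal_snr_db(int bits) { return 6.02 * bits + 1.76; }

      int main() {
          std::printf("10-bit: %.1f dB\n", ideal_snr_db(10)); // ~62.0 dB (classic AVR Arduino ADC)
          std::printf("12-bit: %.1f dB\n", ideal_snr_db(12)); // ~74.0 dB (Due ADC)
          return 0;
      }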
      • by Andy Dodd ( 701 )

        It does have 32 bit registers.

        An example of a benefit of this is softPWM implementations - on AVRs, softPWM with greater than 8 bits of resolution is a real bitch because once you go above 8 bits, the mathematics slow down a LOT. I worked around this once by having a fast 8-bit PWM loop that was dithered every PWM cycle by an outer sigma-delta modulator loop. I would've been able to do straight softPWM a lot more easily with 32-bit registers.

        It's also clocked at 96 MHz, significantly higher than the 16 MHz of the current AVR-based boards.
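
        For the curious, here's a minimal sketch of that kind of error-feedback dithering (not the parent's actual code; the 12-bit target and the 8-bit fast loop are just for illustration):

        #include <stdint.h>

        // Fast 8-bit soft-PWM duty, dithered each cycle so that the long-run
        // average hits a 12-bit target (0..4095).
        static uint16_t target12 = 1000;   // desired 12-bit duty
        static uint8_t  frac_acc = 0;      // accumulated fractional error (0..15)

        uint8_t next_duty(void) {
            uint16_t duty = target12 >> 4;            // coarse 8-bit part
            frac_acc += target12 & 0x0F;              // add the 4 fractional bits
            if (frac_acc >= 16) {                     // carry out: bump this cycle's duty
                frac_acc -= 16;
                duty++;
            }
            return duty > 255 ? 255 : (uint8_t)duty;  // clamp the full-scale corner case
        }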

        • by Anonymous Coward

          With AVR, modifying a single bit requires a read-modify-write operation. In the CM3, each register and memory location is aliased to 8 memory locations, each representing one bit of the aliased location. If the Arduino IDE takes advantage of this trick, it means that you won't pay the heavy I/O penalties Arduino is (or at least was) notorious for.

          That was entirely a function of Arduino software suckage, not AVR. The AVR instruction set has "sbi" and "cbi" instructions to set/clear a single bit in an I/O register. They take 2 cycles to execute on most AVRs, or one cycle on the newer faster ones.

          I don't remember the exact details because it was years ago, but I looked into why Arduino was so bad and there was some kind of screwy ideological resistance to structuring their software stack so they could take advantage of lightweight IO on AVR. My over
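
          For reference, the bit-band aliasing quoted above boils down to a simple address computation (a sketch; the region and alias base addresses are the standard Cortex-M3 ones, but the example register address is made up):

          #include <stdint.h>

          // Each bit in the Cortex-M3 peripheral region (0x40000000...) has a
          // word-sized alias at 0x42000000...; writing 0 or 1 to the alias word
          // touches just that bit, with no read-modify-write.
          static inline volatile uint32_t *bitband_periph(uint32_t byte_addr, uint32_t bit) {
              uint32_t offset = byte_addr - 0x40000000u;
              return (volatile uint32_t *)(0x42000000u + offset * 32u + bit * 4u);
          }

          // Hypothetical use: set bit 5 of some GPIO register in a single store.
          // *bitband_periph(0x400E0C30u /* made-up address */, 5) = 1;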

      • The TL;DR of this post is basically self-driving robot: 8-bit MCU. Self-flying robot: 32-bit MCU. Don't mess up the analog stuff.

        Of course the ARM has 32-bit registers everywhere. That's kind of the point. It also has 32-bit arithmetic units, including division I presume (but don't quote me on that). 32-bit arithmetic units will come in handy in a lot of advanced applications such as signal processing and control systems where you're often working with more than 8 significant bits of data. (Caveat: As always, t
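
        As a concrete (made-up) illustration of where the 32-bit ALU pays off: a one-pole fixed-point low-pass filter is a handful of instructions on the ARM, while an 8-bit AVR has to synthesize every 32-bit add and shift from byte operations:

        #include <stdint.h>

        // One-pole IIR low-pass, y += (x - y)/16, in Q16 fixed point.
        static int32_t state_q16 = 0;

        int16_t lowpass(int16_t sample) {            // e.g. a 12-bit ADC reading
            int32_t x_q16 = (int32_t)sample << 16;   // promote to Q16
            state_q16 += (x_q16 - state_q16) >> 4;   // alpha = 1/16
            return (int16_t)(state_q16 >> 16);       // back to integer samples
        }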

    • ARM is a full modern von Neumann 32-bit CPU (though maybe this version is stripped down w/o an MMU). AVR is a very small 8-bit Harvard architecture. (Harvard architecture confuses things since the instruction word size can be bigger than the memory word size, thus 8-bit registers but 16-bit instructions.) The ARM version also has a much larger chunk of RAM, whereas the AVR was extremely limited (not sure what Arduino used, but some versions only have 256 bytes total).

    • After we're done slashdotting arduino.cc [arduino.cc], go take a look around. Arduino makes an open hardware and software design for an 8-bit microcontroller board with a bunch of pins for analog and digital input and output, with a friendly C-based integrated development environment. Even if you're an artist and not an electronics engineer, it's a friendly easy-learning-curve environment for building electronics that respond to sensors, taking technology that used to be opaque magic and turning it into transparent c

    • The biggest difference will most likely come from RAM. CPU speed will help with number crunching for some algorithms, but going from 2K to 50K of RAM will actually allow you to run a proper TCP/IP stack, store the image from a camera so you can process it immediately or save it for later use, or keep decent-sized data sets in memory for machine learning algorithms.

      It's a very good candidate for a central controller for a project. I doubt the chips will be as cheap as the ATmega chips, though.

  • Encouraging Overkill (Score:3, Informative)

    by MrOctogon ( 865301 ) on Monday September 19, 2011 @02:23PM (#37446948)
    As somebody who is looking for a more powerful prototyping platform on the cheap I look forward to this. But I would not use it for a majority of my hobby projects, which do not need a lot of this power.

    Most Arduino projects only use a few I/O pins and very little processing power. Many hobby projects could be made with a much weaker PIC processor, and many could get by on the basic 8-pin PICs. Many people don't know that the simpler solutions exist, because they only see Arduino stuff all over the web. The full development board is way overkill.

    Additionally, with current Arduino setups, it is fairly simple to make a clone around an ATmega chip. All parts are easily soldered through-hole, and the schematic is easy. With a 32-bit surface-mount chip, the schematic gets complex enough that most hobbyists are scared off by the hard soldering and the crazy layouts. The open-source, easy-to-clone nature of Arduino that made it what it is today is incompatible with the new high-end boards, and people will have to pay more for the official dev boards, or for something else professionally fabbed.
    • by drolli ( 522659 )

      Many, maybe most, but as soon as you want to do something which requires a little signal processing, you appreciate more computational power than the AVRs provide.

    • Unfortunately, many of the sensors take up a whole bunch of pins. You get a simple LCD and, depending on the interface, over half your pins are used. This becomes an issue when you're trying to make something like a mobile robot with a touch LCD screen. Then, if you want fully articulated arms, you're going to need more pins for each servo. You could always offload them to their own processors, but I think there is a market for these uber Arduinos that will let someone not worry about how many pins they have left.

      I don't thi

    • Arduino's current Uno design burns an entire ATmega chip converting USB to less-useful serial, matching what they used to do with a more specific serial chip, and it's difficult to really use the USB through that interface. The new Arduino Leonardo follows on from that by using one of Atmel's newer chips that does USB functions as well as general processing in one chip, so it makes it easier to do a lot of the functions they mentioned, like keyboard and mouse protocols. The catch is that all of Atmel's AVR c
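
      As a taste of what native USB buys you, here's roughly what a Leonardo-class sketch looks like using the stock Keyboard library (a sketch only; the button pin and the typed text are arbitrary, and it needs a board with native USB):

      #include <Keyboard.h>

      const int BUTTON_PIN = 2;                 // arbitrary pin choice for this example

      void setup() {
        pinMode(BUTTON_PIN, INPUT_PULLUP);      // button to ground, pressed = LOW
        Keyboard.begin();                       // enumerate as a USB keyboard
      }

      void loop() {
        if (digitalRead(BUTTON_PIN) == LOW) {
          Keyboard.print("hello from the microcontroller");
          delay(500);                           // crude debounce
        }
      }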

    • Overkill is great for prototyping. The whole thing about premature optimization goes for hardware as well. Besides, cost per chip doesn't really increase much, and probably neither will power consumption (since the chip finishes its work faster and can then sleep longer).

      I'm with you on the hard soldering though. I wouldn't want my worst enemy to hand-solder a QFN64 :/. I still hope a smart guy will come up with some cheaper home PCBA tools. Then again, I thought most Arduino people used professional dev boards.

    • by makomk ( 752139 )

      Pretty much anything that involves networking is beyond the limits of the Arduino, though; the official Arduino network shield actually has a separate ARM processor running code from WizNet that implements the TCP/IP stack. Also, anything that involves highish-speed I/O to or from a host computer is a pain, because on classic Arduino you're limited to a USB serial link.

  • And this goes nicely with the Microchip PIC32 (MIPS-based) Arduinoid chipKIT boards that went public at the Bay Area Maker Faire: http://themakersworkbench.com/?q=node/421 [themakersworkbench.com]
  • by Dunbal ( 464142 ) *

    Perhaps Intel should start to worry about the lower end of the processor world.

    This has always belonged to the likes of Motorola and others. Comparing ARM to x86 is like comparing apples and oranges. At some point people will wake up and realize that after reading a book and playing Angry Birds, tablets are rather limited. I know a lot of people suffering from buyer's remorse after buying tablets. They have their uses, they have their niche market, but when you make a tablet do everything a laptop or desktop does, you end up with a (bad) laptop or desktop.

    • by bucky0 ( 229117 )

      So, this is me not knowing the field very well, but why wouldn't Intel need to be worried about these devices? A significant part of the x86 market is people running simple word-processing, web-browsing applications that don't demand a lot of CPU power.

      Additionally (and it's probably worth a different thread), why doesn't Intel just release ARM processors? If Microsoft is releasing an ARM port of Windows 8, theoretically, a lot of people (at least the big guys) will be porting their applications to ARM a

      • why doesn't intel just release ARM processors?

        Intel sold off XScale to Marvell back in 2006. Is that what you mean? Besides, Itanium has better predication.

        I recall maybe ten years ago people talking about how everything was going from C to just about anything else. But the reality is that C/C++ is still on the ground floor doing the heavy lifting.

        I suppose if you're stranded on a ship in the doldrums, you shake your fist at the sky above rather than the ocean beneath, which makes perfect sense unti

      • The comment about Intel is entirely unrelated to this Arduino version. It's still vastly too small to be used in a tablet or smartphone. ARM can be used for something much more powerful but not this particular board. You might find this CPU as just a peripheral on a full tablet.

        Intel already does higher end ARM chips of the type you'd expect to see in a tablet. Intel is a big company, they don't only do x86 chips and PC motherboards.

    • Perhaps Intel should start to worry about the lower end of the processor world.

      This has always belonged to the likes of Motorola and others.

      Except...an entire generation of engineers cut their teeth on the Intel 8051.

      Does Intel still collect licensing fees from other manufacturers who used 8051 intellectual property? I dunno. But they definitely were a player in this space, directly or indirectly, at one point.

  • I don't believe Raspberry Pi is available yet (see http://www.raspberrypi.org/faqs/ [raspberrypi.org], first line), but I'm seriously considering picking one up when it's available. Arduino is definitely an amazing platform, but I just don't need something quite that low level. I'm currently toying with turning one of those cheap $10 "cameras" into a wireless surveillance cam. As far as ARM goes, I'm not deep enough into the hardware to care either way anyway.

  • Intel is making plenty of money, but they definitely see ARM as a long-term threat, which is part of the reason they've been focused on power consumption and performance per watt for the last 5 years.

    • by Arlet ( 29997 )

      Of course, this project has a low-end ARM. It is far away from any market that Intel is interested in.

      The higher end ARMs that are used in tablets and phones are monsters compared to this one, and not really interesting for hobby use anyway. They are usually only available in BGA, and depend on external memory chips.

      • It's true that this isn't a market that Intel has shown any interest in. However, this points to one of the fundamental differences between Intel and ARM. ARM already has years of mastering low power designs. For them to stay competitive means they just need to keep figuring out how to make their designs "fast enough", and work with their licensees to make them scale down to smaller process nodes. That's mostly a technical challenge, and it's one that the ARM licensees do a lot of the work on (CPU frequency

        • by Arlet ( 29997 )

          I think Intel's job is easier. High performance requires adding very complex design, and Intel already has that design; ARM is still working on it.

          To get lower power just means a combination of better manufacturing (equally hard for both), and removing transistors, which is much easier than adding them in.

          • You have no clue about what makes a low-power design. If it were that simple, the Atom would be 2x as fast and use 1/10 the power. Maintaining performance at low power requires completely different designs than high-power, high-performance parts. Why do you think Intel abandoned the P4? The Core architecture is derived from the P-M, which is derived from the P3. They took the bus interface and a few other things from the P4, but the P4 execution architecture was thrown away in favor of the much more power-efficient P3 der

  • Really? They should start to worry only now? If I were them, I'd be right in the middle of worrying, not simply starting to do it.
    • by dbc ( 135354 ) on Monday September 19, 2011 @04:10PM (#37448774)

      Sure, and Intel has been worrying for over 15 years. But here is the thing... the #1 thing that matters at Intel is gross margin per wafer. Intel fills its fabs, and runs them throttle to the firewall 24x7. Every project is ranked by gross margin per wafer... fall below the cut-off line, and you either buy fab from somebody like TSMC, or go find a different project to work on. The Intel Atom is a successful attempt to create a power efficient part that meets the gross margin per wafer test. Go look at the margins of the ARM makers. I'll bet it doesn't match Intel's.

      I overheard a very interesting+insightful conversation among vendors at the ARMTech conference a year or so ago. "We are all just vendors of value-added flash. Look at the die photos. It's a whole lot of flash memory, with a little bit of logic around the margins for a processor and peripherals in order to differentiate our flash from the other guys' flash and add some value."

      Intel is doing what makes business sense for Intel. But they are watching. And Intel, big as it is, can turn on a dime, and has enough fab capacity to pave over with silicon any competitor that gets in its boresight. That said, in the space where I work (embedded), ARM is taking over the world. It really makes zero sense to use an 8-bit uCtlr just about anywhere anymore, when you can get an ARM in the same size package and at nearly the same cost. Since flash dominates the die area in a microcontroller, 8-bit versus 32-bit logic is noise -- it has less cost impact than the package. There are a lot of Cortex-M3 parts in 48-pin packages now that cost only slightly more than 8-bit parts. (I should point out that there is a huge difference between an ARM Cortex-M3 and an ARM A9; for instance, an MMU.)

      In the end, it comes down to MIPS and MFLOPS, and the die area and power required to do that much computation. When an ARM has enough functional units to match the MIPS and MFLOPS of an x86, it will take as much die area and power. At the complexity level of a Pentium IV, the added ugliness of the x86 instruction set is pretty much noise in the total die area and power. (In a past life I was an instruction decode and pipeline control logic design specialist -- I can tell you that x86 instruction decode is as ugly as it comes -- and in this day and age of out-of-order execution, that almost doesn't matter, except that because of all that ugliness x86 code is freakishly dense, which means the same size I-cache holds a lot more useful code. When you toss in the fact that the ugliness also guarantees employment for instruction decode specialists, I'd call that a win :)

      • by renoX ( 11677 )

        > because of all that ugliness x86 code is freakishly dense, which means the same size I-cache holds a lot more useful code

        Because of the ugliness?
        No: some RISCs have (quite ironically) become variable-instruction-length CPUs using 16- and 32-bit instructions (ARM Thumb/Thumb-2, MIPS16), which makes them competitive with x86 in code density, yet they still have a much cleaner ISA than x86.

  • five SPI buses

    I read that as five spouses and thought: Fuck, what is it doing? Making a harem?!? Then I realized two things: first, we're talking about a mobo, and second, I need more caffeine.

    • That's SPIES, not spouses :-) Actually, no, it's not a mobo in the sense you're thinking of. It's a microcontroller board with a bunch of analog and digital inputs, though you can now support some higher-end input devices as well. SPI is a simple bus for chips to talk to each other, typically used for things like a D/A converter or accelerometer chip or EEPROM to talk to a controller chip.
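
      From the Arduino side, talking to an SPI chip is only a few lines (a sketch; the chip-select pin and the 0x80 "read" command are invented for illustration):

      #include <SPI.h>

      const int CS_PIN = 10;                 // chip select for the imaginary sensor

      void setup() {
        Serial.begin(9600);
        pinMode(CS_PIN, OUTPUT);
        digitalWrite(CS_PIN, HIGH);          // deselected
        SPI.begin();                         // become SPI master, default settings
      }

      void loop() {
        digitalWrite(CS_PIN, LOW);           // select the chip
        SPI.transfer(0x80);                  // send a (hypothetical) read command
        byte value = SPI.transfer(0x00);     // clock out a dummy byte to get the reply
        digitalWrite(CS_PIN, HIGH);          // deselect
        Serial.println(value);
        delay(500);
      }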

  • ARM's relaxed/weak memory consistency model is not an issue for non-concurrent programming, since the compiler takes care of it for you. But when you're writing parallel programs and you're concerned with performance and using lock-free algorithms, ARM is a nightmare: you have to figure out the appropriate memory barriers yet not overuse them and kill performance. x86 has much more limited reordering, and even there it takes some serious thought to make sure your lock-free code is correct.
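
    To make the barrier point concrete, here's the textbook publish-through-a-flag pattern in C++11 terms (a generic sketch, not tied to any particular lock-free library): with relaxed ordering, a weakly ordered core like ARM may legitimately see the flag before the payload; release/acquire adds just the barriers needed and no more.

    #include <atomic>
    #include <cstdio>
    #include <thread>

    int payload = 0;
    std::atomic<bool> ready{false};

    void producer() {
        payload = 42;                                   // plain store
        ready.store(true, std::memory_order_release);   // everything above becomes visible first
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire))  // pairs with the release store
            ;                                           // spin
        std::printf("%d\n", payload);                   // guaranteed to print 42
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
    }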
  • Xilinx is marketing Zynq, a new line of chips containing ARM Cortex-A9 cores with an embedded FPGA peripheral [fpgajournal.com]. Arduino is remarkable because its HW design is open source, and reproducible without a license. So is there an Arduino "IP core" that can be configured on a Zynq FPGA, and then Arduino code run on that?

    How about using that Arduino in FPGA on ARM to replicate the functions of this new Arduino/ARM chip? Would code targeting the Arduino/ARM chip "just work" on the Arduino/FPGA/ARM chip?

    • An AVR core for FPGA already exists [gadgetfactory.net].

      • That AVR8 core is designed to run on "Butterfly One", which is a Spartan 3E. Which I think means that it would give me an Arduino running on the Zynq's FPGA, right? Except I'm not sure the Zynq offers the analog inputs the AVR8 expects. If I added the missing HW to the Zynq, would the board support all Arduino SW?

        • Well, sure, but I have no idea why you would want to pay for that privilege. The Zynq is in a totally different weight class: dual 800 MHz Cortex-A9, DDR2/3 interface, gigabit transceivers, 1 Msps ADCs. All this hardware to replicate the functionality of a 20 MHz 8-bit AVR? Why?

          • Because we're currently running a PIC as a peripheral to an x86 PC in an embedded industrial control app. Zynq/Android might replace the PC, some networking equipment, and the PIC, not just the PIC. If I could use Arduino developer output, and Arduino developers, I'd have more tools and developers to get me there.

            Plus maybe let us port sequential code from PIC MCU to the CPU, gradually as we have time, eventually leaving only the parallel circuits and stuff the PIC did that the CPU shouldn't. Maybe test out

    • by Zarquon ( 1778 )

      Atmel did something similar a few years back with their FPSLIC, but the tools and parts were very expensive and it's more or less dead. It looks like the Zynq has a similar problem... the lowest end part is $15+ for high volume sales.

      • FPSLIC [design-reuse.com] was an 8-bit AVR with an FPGA of up to 2300 cells. All the AVR did was manage the reconfig process; it doesn't look like an app on the AVR could actively invoke logic on the FPGA. The Zynq boots into Linux and leaves the FPGA as a peripheral for circuits to deliver data or interface with external HW. It can run Linux and its app processes on one core, leaving the other core "raw" for running processing directly, or embedded in the FPGA to optimize circuits by factoring ARM instructions out into the FPGA.

        You think

  • The specs on this seem to be better than those of Digilent's UNO32:
    Microchip® PIC32MX320F128 processor
    80 MHz 32-bit MIPS
    128K Flash, 16K SRAM
    42 available I/O (I think that includes 16 analog)

    The UNO32 also costs $25, and its software is based on Arduino's code. Its voltage is also 3.3 V, like the announced board, instead of the Uno's 5.0 V.

    Personally, we will probably be sticking to Digilent, as the company is about a 30-minute walk from where I work (which saves on shipping costs).

  • I don't think that Intel will be worried that something moved from one non-Intel platform to another non-Intel platform, especially in a market segment that Intel does not compete in. Those Atmel/ARM chips run under $5 per chip, sometimes under $1. Intel's Atom chips start around $50 at the low end.

    • Seriously. Besides this being a zero-profit item, Intel is still getting over that horrible beating they got from the PowerPC challengers, aren't they?

      Even if it does offer comparable capabilities and a price or performance benefit, there's too much inertia behind x86.

      I mean, Google+ is arguably better than Facebook, so why isn't everyone on it instead of FB?

      Oh, and let's all listen raptly to the people who hum the "I'd be happy doing web browsing and basic office stuff on a 95MHz chip!!1!". Yeah, sure. That's w

  • What the heck is Arduino?
  • Twice as fast. Almost 4 times as much RAM. Cheaper.

    The only downside is that it's BGA, but if somebody else puts it down on a board for me, that's sweet.

  • Hooray for 3.3v (Score:4, Interesting)

    by Sleepy ( 4551 ) on Monday September 19, 2011 @04:29PM (#37449052) Homepage

    mikejuk's submission paragraph states: "However, it's not all gain — the 3.3V operating voltage and the different I/O ports are going to create some compatibility problems. "

    I respectfully disagree. Firstly, there are already a lot of 3.3 V based Arduinos on the market. I own a JeeNode (see JeeLabs in the EU, Modern Device in the USA). The JeeNode can run a 434 MHz wireless radio transceiver and temperature sensor for MONTHS on a single AA battery boosted to 3.3 V. You could not do that with 5 V.
    Adafruit has a tutorial on converting Arduino Unos from 5 V over to 3.3 V. It's popular.

    Most sensors these days are 3.3 V.

    But most actuators (like stepper motors) require MORE than 5 V. Sure, there are some relays requiring a mere 5 V, and very few work on 3.3 V, but most relays require 6 V or higher. The usefulness of 5 V is diminishing, so what you really want is just enough power to activate a transistor or relay.

    (Some Arduino-compatible chips run great at 1.8 V, and sensors do also... there will come a time when it may make sense to run at less than 3.3 V.)

    I see Arduino more as a collection of standards and open hardware. There are dozens of Arduino designs all of which vary slightly in terms of electrical and physical (pinout, etc) compatibility. But this too is a good thing... the Arduino platform is all about ADAPTABILITY.

  • Does this have proper 2.54 mm (0.1") pin spacing throughout?

    The most annoying thing about the regular Arduino is the fact that you can't use standard protoboard for home-made shields.

    Please tell me they have fixed this problem.

    • by daid303 ( 843777 )

      The 'off' spacing is intentional, so you need to buy shields instead of making them yourself. You're better off with an Arduino Nano if you want protoboard compatibility.

      • Incorrect, and a reminder that Hanlon's razor is not just a nice quote:
        "Never attribute to malice that which is adequately explained by stupidity."

        The pin spacing was an innocent error, not some Machiavellian scheme to ensure the profitability of shields:
        http://www.arduino.cc/cgi-bin/yabb2/YaBB.pl?num=1212632541 [arduino.cc] (post 13)

        The Arduino guys have pretty much thrown open their doors to the world and said, "here's everything we do and how to build it yourself". Why do you see negative when this sentiment is over

  • There is one problem with this. Yes, you can prototype on an Arduino very well, but suppose you have tested everything and want to deploy your project on a dedicated board. I can't see hobbyists soldering QFP packages and making 3+ layer PCBs at home somehow. Compare that to soldering a DIP socket on a possibly single-sided board.

    Maybe one project - one Arduino isn't a problem for some. For me it's way too pricey, and adding a fairly big board with a lot of redundant circuitry to every project is not a good solution.

    Nex
    • by daid303 ( 843777 )

      The Arduino Mega line is already hard to home solder (not impossible but hard) so many people just throw in the Arduino as a whole. (They are even doing this with the DIP versions) This ARM is just the next step up.

  • First off, the SAM3U is based on the Cortex-M3 only, which can't run the full 32-bit profile of the ARMv7 instruction set; rather, it is exclusively capable of executing the Thumb-2 instruction set, which is mostly 16-bit instructions with a handful of 32-bit ones. This is misleading, since it means that conventional ARMv5 or even ARMv7 code targeted at the BeagleBoard (say) will not work on a Cortex-M3 part.

    Second of all, why the hell would Intel have something to fear from a 96 MHz Cortex-M3 part?

  • Now that 24-bit/192 kHz audio is the norm for motherboards, I'd like to see that on a small, cheap board. BTW, how much is the Due going to sell for?
  • An Arduino with a really great CPU only gets us part of the way. While more processing power on the Arduino will be a fantastic jump in the right direction, I/O is still an issue. This Arduino should have an option to solder on, for example, an Atmel FPGA which can be programmed by the CPU and which, in addition, provides a number of FPGA pins already level-shifted to 5 V using a 3.3 V-to-5 V tri-state buffer chip.

    This is just my two cents :)
  • We at http://kidsruby.com/ [kidsruby.com] have gotten Ruby running on one of these. We showcased our work at this past http://gogaruco.com/ [gogaruco.com]; video here: http://confreaks.net/videos/637-gogaruco2011-kidsruby-think-of-the-children [confreaks.net]. These are extremely lightweight in terms of processing, but we got both Debian and Arch Linux up and running after just a bit of work. We have further optimizations planned and are very excited to be working with the Raspberry Pi team.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...