AI Hardware Technology

NVIDIA's $99 Jetson Nano is an AI Computer for DIY Enthusiasts (engadget.com) 127

Sophisticated AI generally isn't an option for homebrew devices when the mini computers can rarely handle much more than the basics. NVIDIA thinks it can do better -- it's unveiling an entry-level AI computer, the Jetson Nano, that's aimed at "developers, makers and enthusiasts." From a report: NVIDIA claims that the Nano's 128-core Maxwell-based GPU and quad-core ARM A57 processor can deliver 472 gigaflops of processing power for neural networks, high-res sensors and other robotics features while still consuming a miserly 5W. On the surface, at least, it could hit the sweet spot if you're looking to build your own robot or smart speaker. The kit can run Linux out of the box, and supports a raft of AI frameworks (including, of course, NVIDIA's own). It comes equipped with 4GB of RAM, gigabit Ethernet and the I/O you'd need for cameras and other attachments.
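The 472-gigaflop figure is plausible as a half-precision peak. A back-of-envelope sketch, assuming 128 Maxwell CUDA cores at a ~921.6 MHz max GPU clock, a fused multiply-add counted as two operations, and FP16 running at twice the FP32 rate (the clock and FP16 ratio are assumptions from NVIDIA's published Maxwell specs, not stated in the summary):

```python
# Back-of-envelope check of the claimed 472 GFLOPS (FP16 peak).
# Assumed figures: 128 Maxwell CUDA cores, ~921.6 MHz max GPU clock,
# FMA = 2 FLOPs per cycle per core, FP16 at 2x the FP32 rate.
cores = 128
clock_ghz = 0.9216
flops_per_fma = 2
fp16_speedup = 2

gflops = cores * clock_ghz * flops_per_fma * fp16_speedup
print(f"peak: {gflops:.0f} GFLOPS")  # ~472
```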
This discussion has been archived. No new comments can be posted.
  • Slow down, cowboy!
  • by Anonymous Coward

    You have to type JETSON! to get it to do anything.

  • by Anonymous Coward

    Sophisticated AI generally isn't an option for homebrew devices when the mini computers can rarely handle much more than the basics

    That's a marketing failure: many people think cheap toys like the Raspberry Pi are the only game in town.
    You can get small form factor FPGA boards in the $100-200 range.

  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday March 18, 2019 @05:48PM (#58295190) Homepage Journal

    4GB puts this into the category where it's actually useful for stuff like web browsing. Sadly, the link to the item from TFA is 404, but it looks like it's actually got enough ports on it to be useful for doing stuff without needing a hub, too. Forget building robots with it, you can build kiosks. Do they have an Android build for it?

    • 4GB for web browsing???

      • 4GB for web browsing???

        Welcome to 2010, let alone 2019. The browser will commonly eat up 2GB by its lonesome. Mine is using 1.8GB right now, and that's while using noscript and blocking all kinds of crap. If you want anything else to be able to happen at the same time you're browsing, without needing to swap, then you need more than 2GB RAM — which is where most of the cheap boards top out.

        • by Shaitan ( 22585 ) on Monday March 18, 2019 @06:51PM (#58295454)

          Sad but true, and there really is no good explanation for it. I just closed everything and opened Chrome: it clocked in at 350MB. Loaded Slashdot as the only page, and suddenly it skyrocketed to 850MB, then slowly settled back to 550-650MB. Now there definitely isn't even 10MB of content in this page, and even with the linked pages you have nowhere near the 200MB it has absorbed in content. It does make me wonder just what the hell it is using so much memory for.

          Compared to the 4MB footprint of Netscape 4 or IE 3.5, you've gained what... some adjustments to JavaScript and CSS? That explains maybe 10MB of the increase. A gargantuan cache? That would explain another 64MB, maybe. The actual page content? If anything, pages have less content these days: with HTML, CSS, and lightweight icons being the styling of choice, it's almost all text. A 2MB page would be massive, but they use shitty autochurned output for most sites these days, so call it 8MB; across all the linked pages, that would fill your 64MB cache. With a little breathing room added in, that's what, about 100MB that can be explained?
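For anyone who wants to repeat this kind of measurement rather than eyeball a task manager, per-process resident memory can be read from /proc on Linux; a minimal sketch (note a modern browser splits itself across many processes, so the figures quoted above would really be a sum over all of them):

```python
import os
import re

def rss_mib(pid):
    """Resident set size of a process in MiB, read from /proc (Linux only)."""
    with open(f"/proc/{pid}/status") as f:
        m = re.search(r"VmRSS:\s+(\d+) kB", f.read())
    return int(m.group(1)) / 1024

# Demo on the current process; substitute a browser PID to measure it.
print(f"this process: {rss_mib(os.getpid()):.1f} MiB")
```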

          • by AmiMoJo ( 196126 )

            Web browser memory use must be the most misunderstood topic in tech these days.

            Firstly, unused RAM is wasted RAM. If it's not used for anything else you might as well use it for cache. Modern operating systems support this, allowing applications to allocate RAM for caching but immediately release it if something else needs to use it.

            For maximum performance web browsers do a lot of caching. HTML5 itself is quite complex - the browser has to build up a "document object model" that allows CSS and Javascript to

            • by Shaitan ( 22585 )

              "Firstly, unused RAM is wasted RAM. If it's not used for anything else you might as well use it for cache. Modern operating systems support this, allowing applications to allocate RAM for caching but immediately release it if something else needs to use it."

              I mentioned caching. There is nothing new in this; there is a performance hit when there is contention for that memory, which is why the default cache gets set to a reasonable value for an application like a web browser, is user tunable, and the rest should

              • This attitude needs to die. Moore's law is dead and we need to start optimizing applications again.

                What do you think that cache is being used for? Preloading functions that may be used by the JS engine. JIT compilation where possible and loading in memory so it's faster when available.

                For all the complaints about RAM usage of browsers, one thing that has increased consistently with nearly every version is the raw computational performance. Page rendering has been getting faster, and JavaScript performance is many orders of magnitude better than in the days of low-RAM Netscape.

                Browsers are complex. They provide a functionality previously not even dreamt about, and for all the complaints they do so quite efficiently.

                • by Shaitan ( 22585 )

                  "Browsers are complex. They provide a functionality previously not even dreamt about, and for all the complaints they do so quite efficiently."

                  Before browsers? Sure. Since netscape? Not really. The original FF was a huge improvement and rewrite. At this point it is more of a bloated and crappy implementation than Netscape was. The only big advance you actually see being used is javascript communication back to the server. It's a browser, it needs to run rendering of a markup language, css markup, and some l

              • by AmiMoJo ( 196126 )

                I mentioned caching. There is nothing new in this, there is a performance hit if there is contention for that memory

                Yeah... I don't think you understand how this kind of caching works. That memory isn't reserved; it's used when it is free, but as soon as an application wants it, the cached data is wiped and the memory is handed over for use. The performance penalty is tiny: the RAM has to be cleared on allocation anyway, so it's just a little bit of accounting overhead, for a massive overall speed boost.

                All modern operating systems support this and already do it for things like disk cache.

                • by Shaitan ( 22585 )

                  "as soon as an application wants it the cached data is wiped and it is handed over for use. The performance penalty is tiny"

                  That "tiny" hit happens millions of times, so it isn't tiny. It's degrading memory performance across the system.

                  Excessive caching is not a massive overall speed boost. The vast majority of that cache is never used, and almost all of that boost can be gained with a cache a fraction of the size.

                  • by AmiMoJo ( 196126 )

                    If your application is allocating memory millions of times over any significantly short timescale then you fucked up.

                • That memory isn't reserved, it's used when it is free but as soon as an application wants it the cached data is wiped and it is handed over for use.

                  That's how disk cache works, not application cache. Just the browser and [modded] FO4 will cause FO4 to run out of memory and die on my 16GB PC. Sometimes I can kill the browser at the low memory warning and have FO4 not die.

    • by quenda ( 644621 )

      What do you mean "finally"?
      Ebay, Amazon etc have a plethora of ARM-based mini-PCs with 4GB under $99. Well under.
      Typically sold with Android as set-top boxes, they can be used for web browsing, office PCs, etc.
      Some will work with Linux, but remember there is no Linux/ARM version of full Chrome.

      • What do you mean "finally"?

        I mean finally featured on Slashdot, where the last several which were slashvertised had only 2GB, or even less.

        Ebay, Amazon etc have a plethora of ARM-based mini-PCs with 4GB under $99. Well under.

        Any idea which are worth a crap?

        • by quenda ( 644621 )

          Any idea which are worth a crap?

          For what purpose? If you are just buying one for personal use, save yourself effort by spending a bit more for Atom instead of ARM.
          The software will be easier.

          • For what purpose? If you are just buying one for personal use, save yourself effort by spending a bit more for Atom instead of ARM.
            The software will be easier.

            Intel can die in a fire.

            Finding these things on Amazon is hard. The search listings wind up mixed with all kinds of unrelated garbage.

            • by quenda ( 644621 ) on Monday March 18, 2019 @08:20PM (#58295796)

              Try searching for "4gb kodi box". Then narrow down to those with an SoC that has decent drivers for your OS of choice.
              Are they any good for you?

              • Try searching for "4gb kodi box".

                Heh. I should have guessed. But I didn't. Thanks!

              • by quenda ( 644621 )

                I'm not sure why that was modded funny.

                The items sold as "kodi" boxes make good cheap general-purpose ARM PCs.
                The Kodi thing means mass-market and cheap.
                If you do actually want to run Kodi on them, it is best to wipe the supplied software, and install LibreElec.

                Obviously this lacks the AI power of the above NVidia, but thread-starter was looking for a general-purpose ARM box with 4GB.

          • The purpose varies greatly, as does the capability. On the low end of both price and capability are the Raspberry Pi models. The most expensive board there is about $35 US; the cheapest is around $10. They make great little kits for things like sensor boxes and retro gaming. Obviously this NVIDIA board is far more powerful and far more expensive.
      • What do you mean "finally"? Ebay, Amazon etc have a plethora of ARM-based mini-PCs with 4GB under $99. Well under. Typically sold with Android as set-top boxes, they can be used for web browsing, office PCs, etc. Some will work with Linux, but remember there is no Linux/ARM version of full Chrome.

        I am one of the first owners of the Jetson TK1 board. The benefits of this board are a) it runs Ubuntu out of the box, b) it has a more powerful GPU, and c) it is supported and well documented by Nvidia. There is a native build of Chrome on ARM - see the Acer Chromebook 13, which also uses the TK1 SoC - and it runs quite well. However, I never did get that build for my Jetson as Google limited its distribution. There may be a way around that these days. I did however get Chromium to work like Chrome throu

        • by quenda ( 644621 )

          There is a native build of Chrome on ARM - see the Acer Chromebook 13,

          Yes, but that is not real Linux, as you discovered. It is closer than Android, which also uses the Linux kernel and has Chrome.

  • It's good that this post linked back to the original Engadget posting, but when you do something like this, you should quote it. The title and first two sentences are also word for word from the Engadget article. Not cool. Not cool at all.

  • by Anonymous Coward

    mini computers can rarely handle much more than the basics

    Minicomputer. [wikipedia.org]

    • "definition of a minicomputer as a machine costing less than US$25,000 (equivalent to $161,000 in 2018), with an input-output device such as a teleprinter and at least four thousand words of memory, that is capable of running programs in a higher level language"

      I would say that this nvidia thing qualifies (with some reservation about the teleprinter).

  • Obviously these people have no clue what is in a so-called "smart speaker".

  • by nospam007 ( 722110 ) * on Monday March 18, 2019 @06:03PM (#58295260)

    This one doesn't even have the 3 laws.

  • Mr. Fusion (Score:5, Interesting)

    by theCat ( 36907 ) on Monday March 18, 2019 @06:27PM (#58295370) Journal

    Soon people will be tinkering with personal-sized AI like they started to do with Arduino a few years back, and 3D printing more recently. The trend here is obvious, but we cannot predict what tinkers will come up with once they get their hands on these things in a big way.

    AI researchers fret about the "containment problem": how do you prevent an autonomous intelligence from breaking out of your lab and doing whatever it wants, including enhancing itself exponentially? So there is talk about creating processes and protocols to contain AI, similar to what you might have regarding biological containment for a microbiology lab working with dangerous pathogens. But those rules aren't going to work when anyone who wants to can build a reasonably powerful AI machine using off-the-shelf components and/or cloud-based resources.

    I don't expect this is going to work out the way we think it is.

    • But those rules aren't going to work when anyone who wants to can build a reasonably powerful AI machine using off-the-shelf components and/or cloud-based resources.

      Reasonably powerful? Do you mean capable of reason? Because it's likely that such will happen in a supercomputer before it happens in some hobbyist's garage.

      • Re:Mr. Fusion (Score:5, Insightful)

        by Kjella ( 173770 ) on Tuesday March 19, 2019 @01:00AM (#58296528) Homepage

        Reasonably powerful? Do you mean capable of reason? Because it's likely that such will happen in a supercomputer before it happens in some hobbyist's garage.

        Well maybe, maybe not. I mean we didn't start out as humans, before that we were monkeys etc. all the way back to single celled organisms. It might be that we're trying too hard to replicate intelligent behavior rather than find the underlying principles of intelligence. For example take AlphaZero [wikipedia.org], it was based on AlphaGo Zero which played Go. But rather than being a dedicated engine with a lot of specific programming for Go it just as easily beat the best at chess and shogi.

        Uber(!?) managed to pull off some impressive results with Go-Explore, a family of so-called quality diversity AI models. OpenAI Five is constantly improving in DOTA; even though they're not playing the full game yet, it's an open map with team play. Tencent is showing off some good Starcraft II play, even if they can't beat the best. DeepMind made some ass-kicking Quake III Arena capture-the-flag bots. But is there some kind of supreme overlay that could evolve into all of these AIs? I would say probably yes.

        That's really the holy grail of AI: the "spark" of intelligence that learns to learn. It's probably relatively simple once we can look at it in retrospect; it's not a lot of code to do one specific task. It's some kind of general pattern to create more complex, dedicated "sub-AIs" for specific tasks, like the different centers in the brain. That and a few billion years of evolution, but computers can pull that off pretty quickly if we can just figure out where we're supposed to start. It'll probably start by beating us at tic-tac-toe, not any of the above.

        • I don't really agree.
          At the moment all these 'AI' are not AI. They are computer code running a comparison between a set of data and whatever is the real-world application of that data.
          So in the field, 'large advances' are when you figure out how to feed it a new type of data and use it for a new type of task. Now Google has been doing this to satellite data for a few decades, to create real-world map data for consumption. But a newer development is that somebody figured out how to feed a comparison program wi

      • by sad_ ( 7868 )

        Isn't all (so-called) AI already 'reasonably powerful' enough to cause mayhem if used in a bad way?

    • This little thing is pushing 500 Gflops; a mid-range GPU has several times the power. So if some rampant "AI" were going to go all DIY SkyNet, it would have happened a few years ago.

  • Progress (Score:5, Funny)

    by JBMcB ( 73720 ) on Monday March 18, 2019 @07:11PM (#58295540)

    30 Years Ago:

    Motorola's new 16-bit microcontroller has a whopping 16K of RAM, and 8 GPIO! You can use it to build robots, or home automation systems, or low-end general purpose computers...

    Today:

    Our new quad-core CPU, 128-core GPU is great if you want to build a speaker!

    • by AmiMoJo ( 196126 )

      To be fair, even today a 16 bit microcontroller with 16k RAM is on the large side. In the sub 32 bit market there are a lot of sub 4k parts with 8 or fewer GPIOs...

    • To be fair, today's speakers are expected to do real-time room correction and beam steering while playing music standalone from an internet stream over your wifi connection.
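For the curious, beam steering at its simplest is just per-channel delays: a toy delay-and-sum sketch in pure Python (whole-sample delays only, written in receive/microphone-array form; steering a speaker array is the transmit dual, and a real DSP path would use fractional delays and filtering):

```python
def delay_and_sum(mics, delays):
    """Toy delay-and-sum beamformer.

    mics:   list of equal-length sample lists, one per channel
    delays: per-channel steering delays in whole samples

    Shifting each channel by its delay time-aligns sound arriving from
    the steered direction, so it adds coherently; off-axis sound stays
    misaligned, adds incoherently, and is attenuated.
    """
    n = len(mics[0])
    out = [0.0] * n
    for sig, d in zip(mics, delays):
        for i in range(n - d):
            out[i] += sig[i + d]
    return [s / len(mics) for s in out]
```

For example, a pulse that reaches channel two one sample later than channel one is recovered at full amplitude with delays of [0, 1].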

  • by edi_guy ( 2225738 ) on Monday March 18, 2019 @09:27PM (#58296016)

    How can I make a Beowulf cluster out of these?

  • by WaffleMonster ( 969671 ) on Monday March 18, 2019 @09:37PM (#58296042)

    This is cool shit for what it is.

    What else has the ability to encode 4k h.265 in realtime /w comparable GPU at a price anywhere near what this thing costs?

    Having said that I do a lot of h.265 encoding and wouldn't touch the NVidia GPU encoders with a 39 and a half foot pole. They suck ass.

    Still at $100 the deal breaker will be what kernel and hardware support look like for this thing.

    Personally also looking forward to the N2.
    https://www.hardkernel.com/blo... [hardkernel.com]

  • An rpi 3b+ draws a max of 5w, and your heatsink is pretty optional. I suspect this is drawing a bit more than 5w.

    And the real news is when a normal person can stably run stuff on it. I've been watching this market for a few years, and the difference between the Raspberry Pi and the Raspberry-Pi-Killers is that with an rpi you can be up and running in under a half hour, depending on how fast your Internet can download updates, and the result is pokey but stable (if you don't stint on the power supply). With
