DARPA Funds Development of New Type of Processor (eetimes.com)

The Defense Advanced Research Projects Agency (DARPA) is funding a completely new kind of non-von-Neumann processor called a HIVE -- Hierarchical Identify Verify Exploit. According to EE Times, the funding is to the tune of $80 million over four-and-a-half years, and Intel and Qualcomm are participating in the project, along with a national laboratory, a university, and defense contractor Northrop Grumman. From the report: Pacific Northwest National Laboratory (Richland, Washington) and Georgia Tech are involved in creating software tools for the processor, while Northrop Grumman will build a Baltimore center that uncovers and transfers the Defense Department's graph analytic needs for what is being called the world's first graph analytic processor (GAP). Graph analytic processors do not exist today, but they theoretically differ from CPUs and GPUs in key ways. First of all, they are optimized for processing sparse graph primitives. Because the items they process are sparsely located in global memory, they also involve a new memory architecture that can access randomly placed memory locations at ultra-high speeds (up to terabytes per second). Together, the new arithmetic-processing-unit (APU) optimized for graph analytics plus the new memory-architecture chips are specified by DARPA to use 1,000 times less power than today's supercomputers. The participants, especially Intel and Qualcomm, will also have the rights to commercialize the processor and memory architectures they invent to create a HIVE. The graph analytic processor is needed, according to DARPA, for Big Data problems, which typically involve the many-to-many rather than the many-to-one or one-to-one relationships for which today's processors are optimized. A military example, according to DARPA, might be the first digital missives of a cyberattack.
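The summary's point about sparse graphs is easiest to see in code. As a rough illustration only (CSR layout and breadth-first search are standard techniques; nothing here is taken from HIVE's actual design), each hop of a traversal dereferences an effectively random global-memory address, which is exactly the access pattern that caches and prefetchers handle poorly:

```python
# Sketch: why sparse graph analytics stresses random memory access.
# Illustrative only -- not HIVE's published design.
from collections import deque

def bfs_csr(row_ptr, col_idx, source):
    """Breadth-first search over a graph in Compressed Sparse Row form.

    The neighbors of vertex v live at col_idx[row_ptr[v]:row_ptr[v+1]].
    Each visited neighbor is a jump to an unpredictable offset -- the
    scattered access pattern a graph processor would target.
    """
    n = len(row_ptr) - 1
    dist = [-1] * n
    dist[source] = 0
    frontier = deque([source])
    while frontier:
        v = frontier.popleft()
        for i in range(row_ptr[v], row_ptr[v + 1]):
            w = col_idx[i]              # random jump into global memory
            if dist[w] == -1:
                dist[w] = dist[v] + 1
                frontier.append(w)
    return dist

# Tiny example: 4 vertices, edges 0->1, 0->2, 1->3, 2->3.
print(bfs_csr([0, 2, 3, 4, 4], [1, 2, 3, 3], 0))   # [0, 1, 1, 2]
```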
  • That's the worst backronym I have ever heard.

    • My thoughts exactly. I'm going to request funding for a Lexicon Randomword Ocelot Longjmp Pancake Covfefe Cheesecake processor; if they funded Hotpants Illicit Vagina Exploit for $80M, imagine how much they'll give me.
    • You've obviously not seen a lot of DARPA projects, then. How about Clean-Slate Design of Resilient, Adaptive, Secure Hosts (CRASH), funded by DARPA's I2O (Information Innovation Office - no, I'm not making this up)?
  • The problem we have currently is that we are focused on creating specific types of circuits. What we should be doing is working on dirt-cheap generic circuits that can reconfigure into anything you want, aka neural network chips. The advantage of these is that you can have flaws in the fabrication process and make up for them by just making a shitload of identical neurons. You could even make up for having a low-speed system by having an ungodly number of neurons in a single machine. This opens the door to
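A minimal sketch of the redundancy idea in the comment above (purely illustrative, not how any real neural-network chip works): average many identical noisy units, and a few fabrication-dead ones barely move the result.

```python
# Sketch: tolerating fabrication flaws through sheer unit count.
# Purely illustrative -- not any real NN chip design.
import random

def redundant_unit(x, copies=1000, dead_rate=0.05, noise=0.5):
    """Approximate f(x) = 2x with many flawed copies of one unit."""
    outputs = [2 * x + random.gauss(0, noise)      # each copy is noisy
               for _ in range(copies)
               if random.random() > dead_rate]     # some copies are dead
    return sum(outputs) / len(outputs)

print(redundant_unit(3.0))   # stays near 6.0 despite noise and dead units
```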

    • by Sycraft-fu ( 314770 ) on Friday June 09, 2017 @09:55PM (#54589705)

      Generic chips that can be programmed into anything you want in the field already exist: FPGAs. It's a huge industry; they get used in everything from your car to your TV, but they have limitations that mean they are never going to be a be-all, end-all.

      There's a place for processors, FPGAs and ASICs, usually all combined.

      • The problem is that FPGAs are really expensive. Neural networking chips are going to be the death of FPGAs because they can be made with cheap-o fab systems.

        • Re: (Score:2, Informative)

          by Anonymous Coward

          No, they're not, because FPGAs and neural networks are different architecturally and serve different purposes.

    • by tgv ( 254536 )

      Neural networks don't work for generic tasks. Try to sort a bank's records with a neural network, or try to use them to display a UI description.

      • Wow, you aren't even trying to hide your ignorance.

        • by tgv ( 254536 )

          Please explain how a neural network would sort bank records. Or how to implement Excel using a neural network. Or ...

          • Forget implementing Excel, you could run the damn original by creating an x86 processor in a neural network. You do realize that brains are neural networks, right?

            • by tgv ( 254536 )

              You do realize that our brains are really, really imperfect: forgetful and illogical?

            • You think we should add an additional layer of abstraction and computation that slows things down and eats up more energy to arrive back at the exact same spot we already were (running x86 code)?

              Why would you want to do that?
              • I never said you would, I said you could. Know the difference.

                • Uhhhh, that was your response to how to sort bank records, that you "could" implement x86 on a neural network, which leads to a couple questions:

                  1 - If you only said "could" knowing that it's not really a good solution, why post it at all?

                  2 - Given that you seem to agree that is not a good solution (e.g. not efficient), what is your answer to the poster that asked you about how you would sort bank records? (the implied full question of course is "how would you do it efficiently with NN compared to current
                  • 1 - If you only said "could" knowing that it's not really a good solution, why post it at all?

                    To make a point about the flexibility of NNs, duh.

                    2 - Given that you seem to agree that is not a good solution (e.g. not efficient), what is your answer to the poster that asked you about how you would sort bank records?

                    Processors designed for NNs will be faster and scale to new heights. Also, sorting bank records is actually something trained NNs would kick ass at.

                    • by tgv ( 254536 )

                      > To make a point about the flexibility of NNs, duh.

                      You don't seem to get that for a neural network there is no difference between memory and processing.

                      > sorting bank records is actually something trained NNs would kick ass at.

                      I really would like a demonstration of that. Input: a few million bank records. Output: the same bank records, ordered according to some criterion like SSN, bank account or credit.

                      But you're a troll, aren't you?
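For reference, the deterministic sort being demanded above is a one-liner on conventional hardware; the record fields here are invented for illustration. The point in dispute is that a comparison sort guarantees the exact output permutation, while a trained network only approximates a mapping and offers no such guarantee.

```python
# Sketch: sorting bank records by a chosen key (fields are invented).
records = [
    {"ssn": "523-11-0042", "account": 9001, "credit": 640},
    {"ssn": "101-55-7310", "account": 4242, "credit": 810},
    {"ssn": "377-20-9988", "account": 1337, "credit": 590},
]

by_ssn = sorted(records, key=lambda r: r["ssn"])
by_credit = sorted(records, key=lambda r: r["credit"], reverse=True)

for r in by_credit:
    print(r["ssn"], r["credit"])
```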

            • No, they're not. A brain is a neural network as much as a tree (plant) is a tree (data structure).

              I say that with a bit of rhetorical fun, but neural networks aren't actually made from neurons. They don't get drunk and have sex (in no particular order). They don't have a chemical nature with unknown features, nor do the computer "neurons" have DNA.

  • by Anonymous Coward

    On the minus side, they will create Skynet or something more insidious, with slimy humans in charge.

    On the plus side, isn't this how you would make hardware ideal for a raytracer? Fixated on memory: a big heap of global memory (you can't really dice up the memory locality too much, since rays have to go all over the whole scene) with huge bandwidths flying everywhere, and you just want to do zillions of intersections.

    The "Graph Acceleration Processor" could operate on e.g. a sparse octree, to give a simple example?
    I woul
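The ray-tracing analogy above holds up in at least one respect: walking a sparse octree is mostly pointer-chasing. A generic sketch (nothing DARPA has specified; ray_hits_box is a placeholder, not a library call):

```python
# Sketch: sparse octree traversal as scattered pointer-chasing.
class OctreeNode:
    __slots__ = ("children", "items")

    def __init__(self):
        self.children = [None] * 8   # sparse: most slots stay None
        self.items = []              # geometry stored at this node

def collect_candidates(node, ray_hits_box):
    """Collect items from every node whose bounding box the ray enters.

    ray_hits_box(node) stands in for a real ray/AABB intersection
    test. Each non-None child dereference below is a jump to an
    unpredictable address, much like a sparse graph hop.
    """
    if node is None or not ray_hits_box(node):
        return []
    found = list(node.items)
    for child in node.children:
        found.extend(collect_candidates(child, ray_hits_box))
    return found
```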

  • by tietokone-olmi ( 26595 ) on Friday June 09, 2017 @10:59PM (#54589965)

    An application-specific integrated circuit, or ASIC. Not a new type of CPU.

    • It's an ASIC in the sense that a CPU is an ASIC.

      • Programmability is commonplace in accelerator chips; GPUs, for example. The article's chip cannot support an operating system on its own.

  • OK, I'm not making any judgments whatsoever about this new architecture, but...ANY radical change in our "accepted" mode of thinking cannot help but be a GoodThing(TM) - ultimately - IMO. $.02
  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Saturday June 10, 2017 @01:45AM (#54590325) Homepage

    This reminds me of the Content Addressable File Store [wikipedia.org] that ICL developed some 50 years ago. OK: different implementation, but today a huge amount of RAM is affordable, whereas CAFS needed to search for the data on disk.
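The contrast with CAFS can be sketched in a few lines (field names invented for illustration): CAFS filtered records as they streamed off disk, whereas cheap modern RAM lets you hold the whole table in memory and address records by content.

```python
# Sketch: content-addressable access in RAM -- retrieve records by
# field value rather than by location. Fields are invented.
table = [
    {"name": "alice", "dept": "eng", "grade": 7},
    {"name": "bob",   "dept": "ops", "grade": 5},
    {"name": "carol", "dept": "eng", "grade": 6},
]

index = {}
for row in table:
    index.setdefault(("dept", row["dept"]), []).append(row)

print(index[("dept", "eng")])   # both eng rows, no disk scan needed
```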

  • This kind of weird name is given to pie-eyed future technology projects so that when the dust settles no-one really knows precisely what didn't pan out.

    Because odds are, they're going to have to fund this again—with an inkling of clue & a vaguely comprehensible name—before this twinkle finally deposits a nugget, third time lucky.

  • Back when I was in university for engineering, I had a professor tell us that there was nothing better than von Neumann architecture, so don't bother looking for anything else. One has to wonder what else university professors are wrong about.
