Looking Back at 1984 Report On "Radical Computing" 183

An anonymous reader writes "The Department of Defense has just released a long-restricted report (PDF) by the JASON group entitled Radical Computing. This 1984 study outlines a number of alternative computing methods that could 'result in a radical improvement in computing.' The study attempts to explain the paradox of how the Russian lag in developing VLSI chips curiously did not critically hinder their accomplishments in space missions, ICBMs, and chess computation. The authors speculate that the Russians might have achieved breakthroughs in alternative computing methods such as residue arithmetic and symbolic computing. (More cynical types assume the Russians bought or stole US chips from the French or other too-helpful go-betweens.)"
"The paper, published by the Government Attic website, also mentions how, eventually, highly parallel computers could make use of these alternative computational methods. Also discussed are such things as functional programming, interval arithmetic, recursive machines, multiple processor concurrency, fast recurrence evaluation, DDA machines, data-flow, and hyper-column cortex model. Which of these ideas ever came to fruition?"
This discussion has been archived. No new comments can be posted.

  • by wiredlogic ( 135348 ) on Saturday April 24, 2010 @06:42PM (#31969910)

    This report is unsurprising... the Soviet approach just seems so stupid to any Western engineer unfamiliar with it

    It isn't exactly stupid. It's just a continuation of the typical methods of engineering before electronic computers became integral tools in the process. As technology grew ever more advanced and sophisticated over the course of the 20th century, they needed to distribute a larger workload across more workers.

  • by Anonymous Coward on Saturday April 24, 2010 @10:18PM (#31970982)

    They are talking about hardware based on residue arithmetic to get around slow arithmetic hardware by making use of look-up tables that would be impractical with a normal binary representation. The problem, according to the paper, is that this requires you to convert the representation of your numbers before doing a number of common tasks, including comparison, division, and overflow detection.

    I didn't need to read the paper to know what they were talking about (but I looked into it and found exactly what I expected: simple number theory to reduce a large space of numbers into a "conceptual" hash table). Look up the "Chinese Remainder Theorem". This is taught to first- or second-year college students. If I had studied computer science instead of mathematics, I would have known about hash algorithms, in detail, as a college student. (As it happens, I did algorithms analysis later.)
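    The scheme the comment alludes to can be sketched in a few lines. This is a toy residue number system, not anything from the report itself; the moduli (7, 11, 13) are arbitrary pairwise-coprime choices for illustration. Addition and multiplication work digit-by-digit with no carries, which is why table-driven hardware looked attractive, while reconstruction needs the Chinese Remainder Theorem.

```python
# Toy residue number system (RNS) sketch. Moduli must be pairwise coprime;
# the representable range is their product, 7 * 11 * 13 = 1001.
from math import prod

MODULI = (7, 11, 13)

def to_rns(x):
    """Represent x as a tuple of residues, one per modulus."""
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    """Carry-free, digit-parallel addition."""
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def rns_mul(a, b):
    """Carry-free, digit-parallel multiplication."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(residues):
    """Reconstruct the integer via the Chinese Remainder Theorem."""
    M = prod(MODULI)
    total = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)  # pow(Mi, -1, m): modular inverse
    return total % M

a, b = to_rns(123), to_rns(45)
assert from_rns(rns_add(a, b)) == (123 + 45) % 1001
assert from_rns(rns_mul(a, b)) == (123 * 45) % 1001
```

    Note that comparison ("is a < b?") has no such digit-parallel form, which is exactly the drawback the paper flags.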

    As for symbolic computing, they basically propose that the Russians might have created a super-optimizing compiler that can simplify complex equations to work "smarter not harder."

    Complex equations? Do you mean "complex programs"? You simplify equations by substitution and domain specific reductions.

    In any case, this is the whole point of functional programming. Programs are constructive logical proofs, and vice-versa. If you get that, you will see that there is only one large-scale transformation that can be done to improve proof efficiency without adding extra primitives: the cut. Anything more is domain-bound (which can be fair enough, if you don't expect the optimization to work on every program).

    The Russians might have been ahead just for realizing the importance of keeping things symbolic, instead of trying to divorce mathematics from computer science. They might have been ahead merely for demanding that their computer scientists study mathematics.

  • by Animats ( 122034 ) on Saturday April 24, 2010 @11:58PM (#31971374) Homepage

    So that's what the JASONs were doing back then. All that stuff on "residual arithmetic", because they apparently thought that N-bit multiplication required O(N) cycles. By the late 1960s, high-end mainframes (CDC 6600, STRETCH, LARC, etc.) had multipliers that could beat O(N), by adding up the partial products pairwise as a tree. That approach is O(log N). This report was written in the mid-1980s, by which time that technology had filtered down to most larger CPUs. Today, of course, every serious microprocessor has it. "Residual arithmetic" just isn't needed. Most of the advantages of that approach were achieved, but by more straightforward means.
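    The pairwise tree summation the comment describes can be sketched in software. This is only an illustration of the depth argument, not a hardware model: real multipliers reduce partial products with carry-save adders (Wallace/Dadda trees), but the level count is the same O(log N).

```python
# Sketch of tree reduction of partial products: N shifted partial products
# are summed pairwise, so the number of terms halves at each level and the
# depth is ceil(log2 N) rather than the O(N) of serial accumulation.

def tree_multiply(a, b, bits=8):
    # One partial product per bit of b: (a << i) if bit i is set, else 0.
    partials = [(a << i) if (b >> i) & 1 else 0 for i in range(bits)]
    levels = 0
    while len(partials) > 1:
        partials = [partials[i] + (partials[i + 1] if i + 1 < len(partials) else 0)
                    for i in range(0, len(partials), 2)]
        levels += 1
    return partials[0], levels

product, levels = tree_multiply(13, 11)
assert product == 13 * 11
assert levels == 3    # 8 partial products -> 4 -> 2 -> 1: log2(8) levels
```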

    However, division using table lookup is widespread. Modern dividers have sizable hard-wired tables. See "Pentium Floating Point Bug" for details.
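    The table-lookup division scheme works roughly as follows: a small hard-wired table supplies a coarse reciprocal seed, which Newton-Raphson iterations then refine, doubling the number of correct bits per step. The table size and iteration count below are arbitrary illustrative choices, and real FPUs use a different (SRT or fused multiply-add based) formulation; the infamous Pentium FDIV bug was a handful of bad entries in just such a seed table.

```python
# Sketch of table-assisted division for a normalized divisor d in [1, 2),
# as in binary floating point. Seed from a lookup table, then refine with
# Newton-Raphson on f(r) = 1/r - d, i.e. r' = r * (2 - d*r).

SEED_BITS = 4
RECIP_TABLE = [1.0 / (1.0 + i / 2 ** SEED_BITS) for i in range(2 ** SEED_BITS)]

def divide(n, d, iterations=3):
    assert 1.0 <= d < 2.0, "divisor must be normalized to [1, 2)"
    index = int((d - 1.0) * 2 ** SEED_BITS)   # top fraction bits index the table
    r = RECIP_TABLE[index]                    # coarse reciprocal seed
    for _ in range(iterations):
        r = r * (2.0 - d * r)                 # each step roughly squares the error
    return n * r

assert abs(divide(1.0, 1.7) - 1.0 / 1.7) < 1e-12
```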

    Data flow machines did catch on. They're just invisible. Inside the Pentium Pro/II/III and later machines is a data flow engine. That's part of how superscalar machines work. But, again, it wasn't necessary to export that painful paradigm to the programmer-visible level. (GPUs, though, are close to data flow machines.)

    The paper on "automated programming" is amusing. This was written just when the "expert systems" fad was tanking, as it was becoming clear that "expert systems" just didn't do very much. The "AI Winter" followed.

    I recognize too many names on the distribution lists for those reports.

  • Most practical problems can be solved with marginally acceptable accuracy without computers. In "the old days," modeling efforts were utterly crippled by the lack of computers so we had to give everything a good margin for safety and hope it was enough.

    Try to design an engine that meets modern emissions requirements without a computer.
    Try to make detailed predictions about the behavior of any circuit containing multiple transistors without a computer.
    Try to design a modern-scale bridge without a computer.
  • by S3D ( 745318 ) on Sunday April 25, 2010 @10:28AM (#31974262)

    The Soviet approach was also very stereotypical: get an army of mathematicians and engineers to find exact analytic solutions to the problems you're trying to solve.
    But nope, just a lot of brainpower misdirected into a lot of horribly inefficient pursuits.

    You are wrong here on both counts, though somehow close to the truth. I was one of those engineers who worked with PDEs at and through the end of the Soviet Union. Finding an "exact solution" was never a priority or purpose of the research; it is mostly impossible anyway. The actual approach was to find more efficient and stable algorithms, that is, to compensate for the lack of computational power with better use and understanding of the underlying math. That led to an emphasis on various multiresolution and adaptive methods, and on the application of stability theory. Not that it was much different from the Western approach. But I have many times seen books on bifurcation theory and topological dynamics sold by street vendors.
