AI Japan Graphics Power Hardware Technology

Japan Unveils Next-Generation, Pascal-Based AI Supercomputer (nextplatform.com) 121

The Tokyo Institute of Technology has announced plans to launch Japan's "fastest AI supercomputer" this summer. The supercomputer is called Tsubame 3.0 and will use Nvidia's latest Pascal-based Tesla P100 GPU accelerators to double its performance over its predecessor, the Tsubame 2.5. Slashdot reader kipperstem77 shares an excerpt from a report via The Next Platform: With all of those CPUs and GPUs, Tsubame 3.0 will have 12.15 petaflops of peak double precision performance, and is rated at 24.3 petaflops single precision and, importantly, is rated at 47.2 petaflops at the half precision that is important for neural networks employed in deep learning applications. When added to the existing Tsubame 2.5 machine and the experimental immersion-cooled Tsubame-KFC system, TiTech will have a total of 6,720 GPUs to bring to bear on workloads, adding up to a total of 64.3 aggregate petaflops at half precision. (This is interesting to us because that means Nvidia has worked with TiTech to get half precision working on Kepler GPUs, which did not formally support half precision.)
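The quoted peaks follow the usual Pascal-generation pattern of single precision running at twice the double-precision rate and half precision at twice the single-precision rate. As a rough sanity check, here is a back-of-the-envelope sketch: the per-GPU figures are Nvidia's published Tesla P100 specs, and the GPU count of 2,160 (540 nodes with four P100s each) is an assumption not stated in the summary.

```python
# Back-of-the-envelope check of Tsubame 3.0's quoted peak figures.
# Assumptions (not from the article): 2160 Tesla P100 GPUs; per-GPU
# peaks are Nvidia's published P100 specs.
P100_DP_TFLOPS = 5.3    # double precision per GPU
P100_SP_TFLOPS = 10.6   # single precision (2x DP)
P100_HP_TFLOPS = 21.2   # half precision (2x SP)
GPUS = 2160

dp = GPUS * P100_DP_TFLOPS / 1000  # convert teraflops to petaflops
sp = GPUS * P100_SP_TFLOPS / 1000
hp = GPUS * P100_HP_TFLOPS / 1000
print(f"GPU-only peaks: {dp:.2f} / {sp:.2f} / {hp:.2f} PF")
```

The GPU-only totals come out a bit below the article's 12.15 / 24.3 / 47.2 petaflop figures, which is consistent with the host CPUs contributing the remainder at double and single precision.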
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Pascal-based? (Score:5, Insightful)

    by davidwr ( 791652 ) on Friday February 17, 2017 @06:44PM (#53889333) Homepage Journal

    Am I the only one that thought "LISP machines, okay, but Pascal [standardpascal.org]?"

    • That is actually the first thing that sprang to mind, even though I had been looking specifically at Pascal-based GPUs recently. :-)

      • Ditto here. I had some memories of long dead undergrad programming courses.
        • "Ditto here. I had some memories of long dead undergrad programming courses."

          Yes, Pascal, really? If we're going to invoke the age of steam, why not go full Fortran with this project?

          • by JanneM ( 7445 )

            Fortran is still very much in use for mathematical programming.

          • Makes no sense. Pascal is a perfectly fine language, even classic Pascal. Fortran, especially classic Fortran 77, is really lacking in many ways (newer versions add new features, but that's like comparing Visual Basic to BASIC and calling them the same language). Granted, Modula-2 or Ada would be much more suitable than Pascal.

            • Fortran, especially classic Fortran 77, is really lacking in many ways

              But perfectly adequate for solving differential equations, which is the main task of scientific and engineering programs.

        • by mikael ( 484 )

          I'm worried the animal activists might take offence at peta-flops.

      • by Anonymous Coward

        I'm older school and keep hoping for an ALGOL supercomputer.

        • by paai ( 162289 )

          Perhaps APL should be revived. Imagine the money to be made by keyboard manufacturers.

          Paai

          • APL was effectively superseded by the J language [wikipedia.org].

            I taught myself some APL[1] using a standard keyboard and an APL implementation that ran on OS/2, if memory serves. Wasn't bad. Yes, you had to compose all the APL characters using Alt-key sequences[2], but APL is so terse that you didn't have to do it very often.

            J uses ASCII rather than APL's grab-bag of symbols. I'm not sure I'd call it more readable, though.

            It would be interesting to see a less-opaque Iverson-Backus language, with the matrix operators and

        • ALGOL supercomputer.

          Now we have multi-threading, Algol68 might actually be the best choice. However, history supports C-more Cray.

    • experimental immersion-cooled Tsubame-KFC system

      I'm sorry, but I like my KFC hot from the fryer. This experiment should end immediately.

    • ok, we're on the same page. japan, wake up.
    • I was thinking "Mr Wirth would be very happy..."

    • by sycodon ( 149926 )

      Umm...is this the Pascal as in the programming language?

      • Re:Pascal-based? (Score:5, Informative)

        by ShanghaiBill ( 739463 ) on Friday February 17, 2017 @08:09PM (#53889731)

        Umm...is this the Pascal as in the programming language?

        Does anyone still program in Pascal? The last time I saw it on a resume was more than 20 years ago.

        Anyway, this supercomputer has nothing to do with the Pascal programming language. It is built using NVidia Pascal GPUs [wikipedia.org].

        • Some people do. It is a very well defined language worth looking into even if only for academic reasons.

          Some would say that everything old is new again; good design merits consideration.

          I am not surprised by this. When faced with coming up with something better, why not choose something old (and proven) rather than try to create something new?
          • Re: (Score:2, Redundant)

            by jbolden ( 176878 )

            Well, first off, these supercomputers aren't about the Pascal language but the Pascal chip. I'd disagree that Pascal was all that proven. It quickly showed structural flaws that let other languages overtake it. Pascal was fairly low level, yet it lacked good low-level interfaces, which is why it lost out to C. Pascal's supporters admit this, and one of the main directions of Turbo Pascal / Delphi was to introduce lower-level facilities into Pascal (for example, partial compilation)

            • Sorry, but everything you write about Pascal is wrong. But I'm too lazy to correct it; you can read Wikipedia.
              You clearly never used it, so why write that bullshit?

              Pascal, especially as the UCSD p-System, was for decades the most widely used 'OS' and programming language on the planet.

            • I don't know if other languages overtook Pascal because of 'structural flaws', but I myself adopted C over Pascal because at the time it was obviously better for raw power and speed. This is true for the processors and architectures that I have developed on (6502/x86/SPARC/AMD64).

              As much as I love C, it is not a good fit on other architectures such as stack machines. Burroughs machines (now UNISYS) use an extended version of ALGOL as a system language with impressive results, especially when it comes t
        • by Lips ( 26363 )
          Inno Setup is an excellent free installer for Windows programs. Its scripting language is Pascal. http://www.jrsoftware.org/isinfo.php/ [jrsoftware.org]
        • by sycodon ( 149926 )

          OK, thanks.

          I liked Pascal. It was very elegant.

        • Comment removed based on user account deletion
    • Being a Pascal programmer I felt horribly bait and switched.

    • by Anonymous Coward

      The real question is, did anyone NOT think that?

      And, more importantly, just how calculated was that clickbait title, and who or what calculated it?

    • by s.petry ( 762400 )
      Pascal has always been exceptional at math. Makes sense for a supercomputer IMHO. The fact that I wrote Pascal for my math degree is a bonus!
      • Pascal has always been exceptional at math. Makes sense for a supercomputer IMHO. The fact that I wrote Pascal for my math degree is a bonus!

        Are you sure you're not thinking of Fortran? I've never heard that Pascal is unusually good at math compared to other general purpose programming languages.

        • by s.petry ( 762400 )
          I programmed in Fortran my first year, but after that we used Pascal. Pascal was not as fast for some operations, but for others it was faster. C was no match, and yes, I took a year of C and even a semester of COBOL just so I could say I did.
        • He meant Pascal, the human. Not Pascal, the language.
          And Pascal as a language is as good as C at doing math ... as is basically every procedural language.

    • by Anonymous Coward

      but Pascal [standardpascal.org]?

      It's more like Pascal [freepascal.org] or, alternatively, Pascal [embarcadero.com].

  • These P100s come with sweet HBM2 and around 500GB/s of memory bandwidth... everything based on dense linear algebra (AI, but also physics simulations) basically flies on them.
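A rough roofline-style calculation shows why that bandwidth figure matters for dense linear algebra. This is an illustrative sketch only: the ~500 GB/s is the commenter's number, and the 10.6 single-precision teraflops is Nvidia's published P100 peak.

```python
# Roofline-style estimate: how many flops a kernel must perform per
# byte moved before a P100 becomes compute-bound instead of
# bandwidth-bound. Figures are assumptions taken from the comment
# above (~500 GB/s) and Nvidia's published P100 spec (10.6 SP TFLOPS).
peak_flops = 10.6e12   # single-precision FLOP/s
mem_bw = 500e9         # bytes/s

balance = peak_flops / mem_bw  # flop per byte to saturate compute
print(f"machine balance: {balance:.1f} flop/byte")
```

A large dense matrix multiply performs on the order of n flops per byte moved, so it easily clears this threshold, while a memory-bound stencil (a few flops per byte) does not. That is why dense-linear-algebra workloads are the ones that "fly" on these parts.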

  • RTFA (Score:4, Insightful)

    by subk ( 551165 ) on Friday February 17, 2017 @07:47PM (#53889629)
    Pascal, the GPU design. Not Pascal, the language.
    • It does make one wonder what they were thinking - if they were at all - when they chose the name.

    • Pascal, the GPU design. Not Pascal, the language.

      So... It won't run Delphi then?

      • Delphi closed, along with BIX, Prodigy, and all the other AOL-like online services of the early 90s.

        • Delphi closed, along with BIX, Prodigy, and all the other AOL-like online services of the early 90s.

          They're all gone? Compuserve too? Sad!

  • I think I'm one of the few people that actually likes Pascal. I also prefer Python over Java and never really cared for C all that much, even though there are similarities. Anyone like or use MyNotex (Linux)? Written in Pascal. ;)
  • by 140Mandak262Jamuna ( 970587 ) on Friday February 17, 2017 @08:55PM (#53889911) Journal
    It keeps coming back. Massively parallel machine, thousands of cores all working in parallel. Naively add up all the megaflops, get some massive number, and tout it big. We could just as well add up the flops of all the servers in some Amazon cloud and claim that is the supercomputer. Back in the 80s the "transputer" was all the rage. Before that it was the "vector" computers. Then "the network *is* the computer", then GPUs...

    As of now there are very few applications for massively souped up GPU processes. Fluid mechanics loves this GPU. Navier-Stokes is probably the most difficult equation to solve, agreed. But it is hyperbolic, with limited "zone of influence", and numerical equations are quite simple, just mass, momentum and energy balances in the control volume. It plays well in GPU, the calculations fit inside the teensy memory and processor. All time domain problems are hyperbolic and they all can be ported to GPU, theoretically. But try squeezing Maxwell's Equations into that teensy processor!

    Graphics card companies are desperately looking for new markets and they keep pushing this. They might as well push a wet noodle across the table. It ain't gonna go nowhere it didn't wanna go.

    • I wouldn't call a modern GPU that constrained really. The only thing they lack is memory protection. It's also a lot easier to program on a GPU ever since the SIMT paradigm came out (i.e. CUDA, OpenCL). Also plenty of modern processors come with a GPU on the same die as the CPU. Like nearly all smartphones for example.

      • by slew ( 2918 )

        I wouldn't call a modern GPU that constrained really. The only thing they lack is memory protection. It's also a lot easier to program on a GPU ever since the SIMT paradigm came out (i.e. CUDA, OpenCL). Also plenty of modern processors come with a GPU on the same die as the CPU. Like nearly all smartphones for example.

        Actually, all modern GPUs have had memory protection for several generations. The problem GPUs have is that they don't generally have full support for demand paging and precise exceptions (SIMT makes that pretty expensive). For example, putting in hardware for the highest possible performance, versus hardware that can hit a page fault, clean up, and restart multiple threads (that might be communicating or synchronizing state), are two different hardware optimization points. That being said, some limit

    • This one is targeted towards neural nets. Does that work ?

  • Making all the other supercomputers Wirth-less.
