Japan Eyes World's Fastest-Known Supercomputer, To Spend Over $150M On It (reuters.com) 35
Japan plans to build the world's fastest-known supercomputer in a bid to arm the country's manufacturers with a platform for research that could help them develop and improve driverless cars, robotics and medical diagnostics. From a Reuters report: The Ministry of Economy, Trade and Industry will spend 19.5 billion yen ($173 million) on the previously unreported project, a budget breakdown shows, as part of a government policy to get back Japan's mojo in the world of technology. The country has lost its edge in many electronic fields amid intensifying competition from South Korea and China, home to the world's current best-performing machine. In a move that is expected to vault Japan to the top of the supercomputing heap, its engineers will be tasked with building a machine that can make 130 quadrillion calculations per second -- or 130 petaflops in scientific parlance -- as early as next year, sources involved in the project told Reuters. At that speed, Japan's computer would be ahead of China's Sunway TaihuLight, which is capable of 93 petaflops. "As far as we know, there is nothing out there that is as fast," said Satoshi Sekiguchi, a director general at Japan's National Institute of Advanced Industrial Science and Technology, where the computer will be built.
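For scale, here is a quick sanity check of those figures as a minimal Python sketch; the only inputs are the numbers quoted in the report above, and the variable names are just illustrative.

# Sanity check of the figures quoted above (illustrative only).
JAPAN_PETAFLOPS = 130     # planned machine, per the Reuters report
SUNWAY_PETAFLOPS = 93     # Sunway TaihuLight, the current number one

ops_per_second = JAPAN_PETAFLOPS * 10**15   # 1 petaflop/s = 10^15 calculations per second
print(f"{ops_per_second:.2e} calculations per second")                 # 1.30e+17
print(f"{JAPAN_PETAFLOPS / SUNWAY_PETAFLOPS:.2f}x Sunway TaihuLight")   # ~1.40x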
Question (Score:1)
Does it meet the minimum system requirements for VR?
Re: (Score:2)
Is this the new "flash without lag" meme?
What do you do with the old ones? (Score:2)
So what do you do with all the old supercomputers when they're too big/power hungry vs performance? Helluva paperweight.
Re: (Score:2)
You can upgrade them. It isn't like these are monolithic computers. You can upgrade network capabilities, change processors.
Re: (Score:2)
Re: (Score:2)
So what do you do with all the old supercomputers when they're too big/power hungry vs performance? Helluva paperweight.
If they're anything like me, they'll put it on a shelf in their basement.
25 years later, they'll try turning it on again just for fun. One of four things might happen, with roughly equal probability:
Post-patent supercomputing (Score:3)
The second silicon revolution will be marked by price freefall as soon as enough patents expire and enough high-output[1] factories can come on-line. Eventually (at most, perhaps a few decades from now), a major world government will realize that if it buys its own factories, keeps them cranking out single-board machines and flash memory at full speed 24/7, and if need be presses into service some superfluous power plant that was about to be decommissioned... it could build a supercomputer the likes of which the world has never seen. Maybe someone more knowledgeable than myself could do some back-of-the-napkin estimates here about what should be possible? (A rough sketch follows after the footnote below.)
As an interesting aside, the power supply, display and input devices will end up becoming the most expensive parts of most consumer electronics, but I think the more interesting question is what the hell are they going to do with all of that computing power once the price floors give way? Protein folding, cryptography... and general AI.
1. The "high output" bit being the kicker. I know very little of the details of chip lithography, so maybe there are hangups here I'm unaware of.
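Here is a rough back-of-the-napkin sketch of that estimate in Python. Every number in it (per-board cost, sustained GFLOPS, power draw, budget) is an assumption picked for illustration, not a measured figure, and it ignores interconnect, storage and housing, which dominate real systems.

# Napkin estimate: raw compute from a huge pile of commodity single-board machines.
# All numbers below are assumptions for illustration only.
board_cost_usd = 50       # assumed price of one single-board machine
board_gflops   = 10.0     # assumed sustained GFLOPS per board
board_watts    = 5.0      # assumed power draw per board
budget_usd     = 1e9      # hypothetical government budget

boards          = budget_usd / board_cost_usd
total_petaflops = boards * board_gflops / 1e6    # 1 PFLOPS = 1e6 GFLOPS
total_megawatts = boards * board_watts / 1e6

print(f"{boards:,.0f} boards -> ~{total_petaflops:,.0f} PFLOPS, ~{total_megawatts:,.0f} MW")
# 20,000,000 boards -> ~200 PFLOPS, ~100 MW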
Re: Post-patent supercomputing (Score:3)
Re: (Score:2)
The problem is that microchip foundries and dies are massive investments. Still, for a major world government, sinking $10 billion into a foundry wouldn't be an issue, especially since silicon looks to be bottoming out.
Yes, but it's a problem that will *eventually* go away. The per-unit costs are just way too small, and the potential upside is way too large for it to not happen in our lifetimes[1]. A $10 billion cost is on the order of the LHC, but unlike the LHC such a project would offer massive ongoing practical benefits in addition to theoretical research potential.
1. [waiting for the "I'm 73, you insensitive clod!" remark]
Re: (Score:2)
Re: (Score:2)
I had similar ideas the other day. The real problem here is that you eventually hit a power wall as you continue to deploy new stuff en masse.
That's why I explicitly mentioned superfluous power plants. There was just a story the other day about coal plants closing. Well... keep one online. How many flops can a single large coal plant running full-tilt give you? I bet it's a lot. I bet the bottleneck, from a super-project perspective (not necessarily for the home user), isn't going to be the electricity.
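As a rough sketch of that question in Python: assume a large coal plant on the order of 1 GW of output and an efficiency somewhere near today's leading machines, roughly 6 GFLOPS per watt; both numbers are assumptions, and cooling and facility overhead are ignored.

# Napkin math: how many flops one large coal plant could power, very roughly.
plant_watts     = 1e9    # assumed ~1 GW plant output
gflops_per_watt = 6.0    # assumed efficiency, roughly in line with today's top machines

total_exaflops = plant_watts * gflops_per_watt / 1e9   # 1 EFLOPS = 1e9 GFLOPS
print(f"~{total_exaflops:.0f} exaflops, before cooling and facility overhead")
# ~6 exaflops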
Re: (Score:2)
Re: (Score:2)
I had similar ideas the other day. The real problem here is that you eventually hit a power wall as you continue to deploy new stuff en masse. Nevertheless, at least certain types of computing could constitute flexible demand, so you might want to power them with any extra generation you have at the moment.
It's also worth noting this doesn't prevent an explosion of dirt-cheap flash memory from happening at some point, which has its own set of interesting implications, particularly if high-quality video sensors also fall in price...
Re: (Score:2)
Re: (Score:2)
For what purpose? (Score:2)
Probably just want to explore all of No Man's Sky [wikipedia.org] ...
Sounds impressive! (Score:2)
---
Asking the important questions since 2016
manishs == crap (Score:2)
WTF is ÃZNational?
$150 million? (Score:2)
That's hardly pocket change, but it seems cheap for what would be the world's fastest-known supercomputer.
For comparison, Tianhe-2 [techtimes.com] (in number 2 spot) cost about $390M to build, and Sunway TaihuLight [wikipedia.org], the current number 1, went live in June of this year at a cost of $273M.
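A quick cost-per-petaflop comparison in Python, using only the figures quoted in this thread (the ~$173M budget and 130 petaflops for Japan's planned machine, $273M and 93 petaflops for Sunway TaihuLight); the figures are as reported, and the arithmetic is just illustrative.

# Cost per petaflop, using the figures quoted in this thread.
machines = {
    "Japan (planned)":   (173e6, 130),   # cost in USD, petaflops
    "Sunway TaihuLight": (273e6, 93),
}
for name, (cost_usd, petaflops) in machines.items():
    print(f"{name}: ~${cost_usd / petaflops / 1e6:.1f}M per petaflop")
# Japan (planned): ~$1.3M per petaflop
# Sunway TaihuLight: ~$2.9M per petaflop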
Probably x86 clusters (Score:2)
Re: (Score:2)
Is that the name of the new supercomputer?
If not, learn to write proper titles.
I think you need to learn to recognize the difference between a noun and a verb.
US commissioned two exaflop computers (Score:2)