Sun Moves Into Commodity Silicon
Samrobb writes "According to Sun Microsystems CEO Jonathan Schwartz, Sun has decided to release its UltraSPARC T2 processor under the GPL. Schwartz writes, 'We're announcing the fastest microprocessor we've ever shipped this week — delivering 89.6 Ghz of parallel computing power on a single chip — running standard Java applications and open source OS's. Simultaneously, we've said we're entering the commodity marketplace, and opening the chip up to our competition... To add fuel to the fire, the blueprints for our UltraSPARC T2... the core design files and test suites, will be available to the open source community, via its most popular license: the GPL.'" Sun is still working on getting these released; early materials are up on OpenSPARC.net.
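For context, the headline "89.6 Ghz" is aggregate-thread marketing math rather than a clock speed: the T2 ships 8 cores with 8 hardware threads each, clocked at 1.4 GHz. A quick sketch of the arithmetic:

```python
# UltraSPARC T2: 8 cores, 8 hardware threads per core, 1.4 GHz clock
cores = 8
threads_per_core = 8
clock_ghz = 1.4

# "89.6 GHz" is the clock summed across all 64 hardware threads
aggregate_ghz = cores * threads_per_core * clock_ghz
print(round(aggregate_ghz, 1))  # 89.6
```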
Nothing to see here, please move along. (Score:4, Insightful)
But seriously, what's the real point? The means to actually fabricate one of these processors are beyond 99% of companies and pretty much 99.99% of the people on the planet. And what about the patents covering the process and equipment needed to actually make the processor?
Commodity is a relative term... (Score:4, Insightful)
Although I submit it would be really cool to just manufacture these things in my garage.
Re:Nothing to see here, please move along. (Score:3, Insightful)
Now most people in developed countries use dozens of processors (including embedded systems) every day, and a desktop of awesome (by 1960s standards) power can be had for a few days' salary.
Think big; cast aside preconceptions.
FPGAs (Score:4, Insightful)
Many FPGA houses provide free ARM cores etc. for inclusion on their FPGAs. You can build an ARM-based (or other core-based) device using freely downloadable tools and run it on an FPGA that costs a few bucks. To actually ship such a device, though, the licensee needs to pay a hefty licensing fee to ARM or whomever owns the core. Now they can also distribute GPL cores.
But is this really useful? To use a GPL core would mean that all the rest of the chip design would have to be released too. Very few hardware builders will be prepared to release their silicon source code because that is often the only way they have of preventing mass knock-offs etc.
Re:Power consumption? (Score:2, Insightful)
You don't have to. But then you can't have the fastest CPU. No matter how efficient they make the chip, you will save power by running it slower, and there will always be a market that trades off everything for speed. So yes, you will always have to trade power efficiency for speed. But, and this is the big one, CPUs are getting faster per watt. An AMD X2 isn't a slow CPU; you can get models that use only 65 watts of power, and they are cheap.
They will be fast enough for just about anyone.
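The tradeoff the parent describes can be sketched as a performance-per-watt comparison. The numbers below are hypothetical, chosen only to illustrate that the fastest chip and the most efficient chip are usually different parts:

```python
# Toy perf-per-watt comparison. Performance scores and TDPs are
# hypothetical illustrations, not measurements of real chips.
chips = {
    "fast chip":      {"perf": 100, "tdp_watts": 130},
    "efficient chip": {"perf": 80,  "tdp_watts": 65},
}

for name, c in chips.items():
    # Higher perf/W means more work done per unit of power
    print(f"{name}: {c['perf'] / c['tdp_watts']:.2f} perf/W")
```

The "fast chip" wins on raw speed, but the "efficient chip" does far more work per watt, which is exactly the tradeoff buyers who don't need the fastest CPU get to exploit.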
Re:GPL and chips (Score:3, Insightful)
Re:Sweet (Score:4, Insightful)
The important thing to note is that the "jokes" often lack humor, so recognizing them becomes a terrifying ordeal of memorizing the groupthink prejudices.
Re:tivoization? (Score:3, Insightful)
Yes. You, the end user, can modify the processor to the extent that is possible for the technology involved. Since a processor is physical hardware, that means the "compilation" phase for modification involves a microprocessor fab. If you don't have one, that sucks - but it's not something that's possible to fix.
Even Richard Stallman, and even the GPLv3, wouldn't complain about you not owning a fab (and therefore not being able to use a modified UltraSparc T2 in practice) as a freedom issue. This isn't like Tivoization, because Sun can't patch your physical hardware either.
Interesting how Sun finds this an advantage.... (Score:2, Insightful)
It looks to me like Sun has figured out that they are in a knowledge business. They are operating on a principle from information theory: they are building a much larger pyramid of customers for their particular computer knowledge, with an enormous new bottom layer of people contemplating this fascinating, powerful chunk of information. They are emitting information, not hardware.
The point from information theory is this: the more high-quality information a source emits, the more valuable the source becomes. Sun still has its stable of PhD researchers from U.C. Berkeley and some more wags from Stanford, so the company will continue in the business of emitting information.
If you want the Macintosh of mainframes, they will sell them to you. If you want to boot Solaris or wire up Sparc chips you are still their customer. Sun will be your first publisher, web site and consultant.
It seems to me that this is a business innovation. It has been 60 years since Shannon's information theory paper suggested that the source that emits information increases in entropy. Sun is doing that by making available a uniquely sophisticated design - not hiding it in file cabinets in the basement.
Re:I'm not sure if people are getting this. (Score:3, Insightful)
Re:Various options. (Score:3, Insightful)
The only things tying Linux to certain architectures are Flash, Nvidia, ATI, etc. In other words, the proprietary software companies are stifling innovation, just as they didn't allow Intel to create a superior processor. However, their stranglehold on Linux is a lot weaker.
Re:Power consumption? (Score:3, Insightful)
You have a short little span of attention. When Intel first hit 60W with the original Pentium there was a huge outcry about its outrageous power consumption, and it hardly performed any better than a 100MHz 486, either. After a quick die shrink, the next version wasn't so bad. Now Intel sells the Core Duo at 65W as a major innovation in power management. After Intel's Prescott, it's almost impossible for anything else to look bad. But really, should a product that never deserved to be made in the first place define the frame of reference moving forward? If you factor the environment into the picture, a TDP of 35W would look far more responsible.
Re:Various options. (Score:3, Insightful)
Executing multiple instructions within a single "opcode" - and then developing a compiler to pre-determine the best path was about the STUPIDEST idea I've ever heard. Just think about it... a compiler has no idea about the REAL conditions at runtime.
A compiler can optimize a single program thread - but can't optimize for multi-threading, multi-processing, or mixed mode execution between the OS and the application. All these things depend heavily on hardware - and Intel/HP made the hardware as stupid as possible in the places where it really mattered.
They optimized for a problem (one instruction per cycle) that was easily overcome with parallelism (SMP or multi-core) and faster clocks. Instead they chose the most complex approach (multiple instructions per opcode, with different possible results), which leans heavily on an overloaded branch-prediction unit that is partially implemented in the compiler.
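The core objection, that a compiler can't know runtime conditions, can be sketched with a toy cycle-count model. Every number here is hypothetical; the point is only that a statically scheduled core eats the full latency of events the compiler couldn't predict, while a dynamically scheduled core hides part of it:

```python
# Toy model: static (compile-time) vs. dynamic (runtime) scheduling.
# All figures are hypothetical illustrations, not real CPU numbers.
loads = 1000
miss_rate = 0.05      # actual cache behavior, unknowable at compile time
hit_cycles = 1
miss_cycles = 10
overlap = 0.6         # fraction of miss latency a dynamic core overlaps

# Statically scheduled core: stalls for the full miss latency
static = loads * ((1 - miss_rate) * hit_cycles + miss_rate * miss_cycles)

# Dynamically scheduled core: hides part of each miss behind other work
dynamic = loads * ((1 - miss_rate) * hit_cycles
                   + miss_rate * (hit_cycles
                                  + (1 - overlap) * (miss_cycles - hit_cycles)))

print(static, dynamic)  # the static schedule spends more cycles
```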
Itanium has improved and I'm sure it will be around for a few more years. If Intel had been smart we'd have 8-core Alpha chips instead, as they own the Alpha intellectual property.
Re:Sweet (Score:3, Insightful)