Hardware

First LEON Silicon Tested Successfully

uglomera writes: "LEON, the open-source CPU developed for the European Space Agency, has been successfully manufactured and tested by Atmel on Atmel's ATC35 process. Gaisler Research, whose CEO Jiri Gaisler wrote the VHDL model of LEON, also offers a real-time kernel, a simulator, a cross-compiler, and more for this SPARC-family processor designed for space applications. Check it out." You can find more good information on the LEON processor on the Gaisler site, including diagrams and further reading. Open Source hardware running Free software -- wheee!
  • by Anonymous Coward
    and LEON is getting larger! [mattsmusicpage.com]
  • I have a few minor reservations about using an open source developed processor for missions in space. Open source is great, and produces some of the most well-written, secure code available anywhere, but I'm not sure we should be trusting this technology to aid in piloting a space vehicle. A space flight is a dangerous and EXTREMELY complicated process, and if a single component fails, or produces an error, the results can be catastrophic.

    NASA, in the USia, spends literally millions of dollars designing the space shuttle's computing system from the ground up. Their in-house coders pore over each microchip and line of code hundreds of times looking for even the smallest bug. This superior attention to detail is not possible using the limited resources of the open-source method. It is also why NASA can succeed in safe, reliable space flight time and time again, while other space programs are struggling. Open source has its advantages, but when 100% reliability is necessary, it may not be the best option.

  • by Anonymous Coward on Monday March 26, 2001 @05:28AM (#339932)
    More stuff like this can be found here: opencores [opencores.org]
  • I doubt they plan to send this widget up without a full separate functionality audit, which they can do for an open source processor, but it might be hard to convince, say, Intel to let them pore over the VHDL for the Celeron or whatever.
    Just because it's open source doesn't mean they aren't going to put it through the same rigorous tests that they would put a commercial processor through.
  • Not wanting to bait flames here, but...
    It is also why NASA can succeed in safe, reliable space flight time and time again, while other space programs are struggling.

    How exactly would you back up your claims about this? The only other country with "space flight time" (I assume manned flights there; if you're also talking probes or satellites, your affirmation is awfully clueless/uninformed) is Russia, which is broke. Mir was scheduled to fall because of lack of funding, not because of a technical failure.

    Just wanted to set things straight.

  • The article is talking about a processor, not code that runs on the processor. What about the problems [uclan.ac.uk] faced by commercial, closed processors? I think this is a great step in the right direction; a synthesizable VHDL processor that can target FPGAs.

    NASA, in the USia, spends literally millions of dollars designing the space shuttle's computing system from the ground up. Their in-house coders pore over each microchip and line of code hundreds of times looking for even the smallest bug. This superior attention to detail is not possible using the limited resources of the open-source method.
    That's exactly where open source shines. Anyone can look at the code. Look at OpenBSD for instance. LEON could end up being carefully inspected by more knowledgeable people than any other processor design ever used by any space agency. Heck, if the agencies are worried about it, they can pore over the design just as they would for any other chip.
    --
    Patrick Doyle
  • Open source hardware seems a bit odd. The nice thing is, yes, we can have the specs to make apps really fast, instead of, for example, having to wait for Intel to decide to contribute to gcc or the kernel, or (finally) release their in-house (Linux) compiler. So a more open spec on the processor would allow people to actually get things right. I guess the question that sits on my mind is what is truly more valuable: a wide-open spec, or a more open design so anyone can manufacture one.

    I guess my feelings are known because I believe that hardware's design should not be completely open. Let a manufacturer keep it all to themselves. However, when you buy the processor you should have the _complete_ spec available. (Which is very different from what Intel does now.)

    ---
    My opinions do not represent those of my employer of course.
  • You have a good point...but if you combined the in-house efforts with open source you add even more eyeballs (a Good Thing).

    And as the open source side of the equation is free to tinker without adhering to the strict guidelines that the in-house coders are working under, the potential for getting bitten by tunnel vision is minimized. Contributors with a different perspective can spot problems in the capabilities and scope of the project that are totally unrelated to whether the code is buggy or not.

    Kinda like how HELLO WORLD is bug-free... but that won't help a blind man use it worth a damn.

  • In a word, yes. Having it open source doesn't mean you can't be as careful as you want. It doesn't mean that you can't spend millions of dollars and man hours with in-house coders poring over the design.

    If it is done properly, it just means that other people can look at it carefully, too.

    As long as there are reasonable controls (analogous to write access on a public CVS server), an open source hardware project like this should, in theory at least, be able to do a better job than NASA can dream of. Now as for practice, we will have to see. But there is no rational reason to discount this out of hand.

    On the other hand, if you just throw the design open to the public and expect it to get better/tested on its own, you are a fool. But there is no reason to think we are dealing with idiots here.

    S.
  • Since when does open source mean unfunded?

    There's no reason NASA's code couldn't be open sourced and have the exact same results.
  • by n3rd ( 111397 ) on Monday March 26, 2001 @05:45AM (#339941)
    I was speaking to a co-worker the other day about Sun's UltraSPARC III processor and he was telling me about CPU manufacturing in general.

    To actually create a fabrication facility to make CPUs, it takes about 20 billion dollars ($20,000,000,000). That's more than most companies can afford. Even Intel couldn't make very many new fabrication plants.

    If a company can't afford to create their own plant, they have to schedule time at a fabrication facility. This is basically a window (say 48 or 72 hours) where the facility will crank out as many chips as possible. If they miss the window for some reason, they have to re-schedule and it can be months until there is another open time slot.

    What I'm getting at is that designing Open CPUs is a great idea. It allows developers to really get inside the hardware and optimize the hell out of applications, which is a good thing. However, the actual cost to make these CPUs is staggering, and unless a big company puts up some big bucks, I don't see it happening in the near future.

    My co-worker also mentioned how low-cost almost everything else is: video cards, NICs, sound cards and the like. Wouldn't it be better to focus on products like these, since they would work with all hardware (how about a video card that worked on Sun and x86 machines?)? With Moore's Law getting us faster and faster CPU speeds, perhaps it's time to make the peripherals first, and focus on a CPU once we have found success with smaller projects.
  • I'm quite aware of the problems facing traditional closed-source processors. I'm not trying to imply that the Celeron or Athlon would provide a suitable alternative either. I remember Intel's problems with the Pentium floating point bug, and problems compiling the Linux kernel on Intel's ill-fated 1.13 GHz PIII. No, modern PC processors would not make a reliable shuttle processor.

    If I recall, the US space shuttle runs on something like a dozen underclocked i286's, each processor with something like 5-way redundancy. Each processor in a set of five will perform the same calculations, and the solution presented by the majority of processors is deemed to be failsafe. The 80286 was chosen because it is simple, and its reliability has been proven through years and years of experience in the field. In addition, Intel has spent millions on development and testing of this processor to ensure it is 100% reliable.

  • by Rocky Mudbutt ( 22622 ) on Monday March 26, 2001 @05:46AM (#339943) Homepage
    Please read the article(s) before firing off negative spin.

    LEON [gaisler.com]

    The LEON model also exists in a fault-tolerant version, capable of detecting and correcting single-event upset (SEU) errors in any register or RAM element. This is done completely in hardware without any software intervention. The area overhead for the fault-tolerance functions is approximately 30%, while the timing penalty is around 5%. The fault-tolerant features make it possible to use LEON in the severe space environment without having to develop specific SEU-hardened cell libraries. The LEON fault-tolerant VHDL model can, under some conditions, be licensed from ESA.
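
    The quote doesn't show how that hardware-level correction works, so here is a minimal, hypothetical VHDL sketch of one common technique for it: triple modular redundancy (TMR) on a register with a majority voter. It is only an illustration of the idea under that assumption; it is not taken from the LEON-FT sources, and the entity and signal names are made up.

      library ieee;
      use ieee.std_logic_1164.all;

      -- Illustrative TMR register: three copies of one flip-flop plus a
      -- 2-of-3 majority voter, so a single upset bit is masked and then
      -- scrubbed on the next clock edge. Not LEON code.
      entity tmr_reg is
        port (
          clk : in  std_logic;
          en  : in  std_logic;   -- load enable
          d   : in  std_logic;   -- data in
          q   : out std_logic    -- voted data out
        );
      end entity;

      architecture rtl of tmr_reg is
        signal r0, r1, r2 : std_logic := '0';
        signal voted      : std_logic;
      begin
        -- majority vote masks a single flipped copy
        voted <= (r0 and r1) or (r1 and r2) or (r0 and r2);

        process (clk)
        begin
          if rising_edge(clk) then
            if en = '1' then
              r0 <= d; r1 <= d; r2 <= d;
            else
              -- while holding, each copy reloads the voted value,
              -- so an upset is corrected rather than accumulating
              r0 <= voted; r1 <= voted; r2 <= voted;
            end if;
          end if;
        end process;

        q <= voted;
      end architecture;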


  • It sounds like you believe the Brits do nothing other than run down to the local electronics shop, slap a bunch of stuff together and fire it off into space. I do not deny that space flight is dangerous, but the idea that a single component failure will produce catastrophic results doesn't hold up, at least for the computer equipment, once you bother to think about how it is built.
    If they are doing anything like what NASA has done in the past, then there is not one but three computers for each group of tasks. These three computers vote; if one is consistently out of sync with the others it gets shut down.
    I don't think that NASA has cornered the market on testing systems. As the public has seen, Intel and Motorola are not perfect. What the public does not tend to see are the less well known bugs which have come and gone with less public fanfare. The one I struggled with the most was a bug in the 386 chip which made it useless for network-based semaphores.
    Equating open source with bug-ridden and unreliable is to look upon what is currently available with closed eyes.
  • Open source doesn't mean limited resource, it means that the source is open to anyone that wants to inspect it. Just because the chip design is open source does not mean that it wasn't produced by full-time paid professional chip designers, on company time. It just means that beyond those who designed it, more people can inspect the code and suggest improvements.

    People have got to get over this modern idea that open source means produced in a bedroom by amateurs. Twenty-five years or so ago, most commercial software was open source - as in, it came with source when you bought it. And people didn't complain about it, or claim that this somehow reduced its quality.
  • by f5426 ( 144654 ) on Monday March 26, 2001 @05:49AM (#339946)
    > I guess my feelings are known because I believe that hardware's design should not be completely open

    Would you care to explain to us why?

    I see advantages in the fact that:

    1/ Everyone (with the knowledge to understand it) can see how the processor is done
    2/ Everyone (with the motivation) can learn from the processor core
    3/ Everyone (with the foundry) can manufacture a totally compatible processor
    4/ Everyone (with the technical skills) can improve the processor design
    5/ (A few other ones)

    Of course you are not going to run a plant in your backyard, but I fail to see real downsides to the idea.

    Cheers,

    --fred
  • So it's so reliable that you need to do the same calculation five times and take the most popular result?

    Assuming that the processing demands of current and future space flights are increasing beyond what a dozen 286s can accomplish, what would you have NASA, ESA et al. do? Use more and more 286s? Use a higher-end PC processor? Use a normal workstation processor? Or use a radiation-hardened processor that is compatible with a popular, well tested processor (SPARC)?

  • NASA makes most of their stuff open source.
  • As a matter of fact, I did read the article. You are wrong.

    A direct quote from the article [gaisler.com]:

    "LEON was initially developed by Jiri Gaisler while working for ESA, and Gaisler Research is now working under ESA contract to maintain and further enhance the model."

    The processor was developed under open-source methods by an independent company for the ESA.

  • If you want highly cost efficient production in huge quantities, you are right. If you want just one, you can use an FPGA (the article specifically mentions using Xilinx chips) and reprogram it at will - great for testing and development, or if you want a small series there are a number of companies producing small series of chips.

    The downside? Cost and performance. You won't get high-end performance out of an FPGA, for instance. But on the other hand, for an embedded system an FPGA gives you the opportunity to "easily" develop a full system on a chip, test it, debug it in actual hardware, and then, depending on the volume you need, decide whether to use FPGAs or custom ASICs in the final, shipping device.


  • Thousands of excellent designers work together from tens of nations, and they start mixing feet with meters again... Yikes!

    Now where did I leave my babelfish...

    Sofar.
  • Not to mention the CAD design software sells for about $100,000 a seat.
  • In fact, IIRC, China has planned a manned space flight to Mars, slated for 2013.
  • How about the industry becomes unprofitable because those in the hardware business (making chips...not selling a computer) are generally not making their cash in the services. I am a capitalist. I believe that companies should be able to make money. The Open Source model works so well for software because companies can sell services while making less money on the software. Hardware does not work that way. While this does have its advantages in using the power of open collaboration... it makes companies do the cheapest possible production (like RAM). Although it may help the quality of the design in the long run... it will not help the quality of the product or the stability of the company.

    For example, Company X, Y and Z make these CPUs. Company X employs 4 people to work on the spec of this chip so they have their in-house employees. Company Y employs 2 people. And Company Z employs none. Company Z has less overhead of employees to work on the product and can sell the same thing for less. That puts Company X out of business. In order to compete, Company Y has to lay off its employees that work on the product.

    How do the tech people make a living off this model? We could try to sell support if someone is writing a compiler for it... but... wait... the compiler is gnu and the money isn't there since the spec is open anyway. They will figure it out on their own. Where do we make our money? Maybe I fail to see the model correctly, and I apologize if I am being ignorant of how this system will work.

    ---
    As always, my opinions do not represent those of my employer.
  • China and India also have space flight programs, as well as many other countries that have the ability to launch small, unmanned flights.

    Indeed. Europe does, too; with quite serious numbers and success rates, BTW. And they aren't particularly plagued by the failures we have come to expect from NASA recently. How many probes did they lose or mishandle in the last few years?

  • The 286 (special version) was chosen because it was radiation-shielded relatively easily. They reused parts from previous missions, probably to save on R&D. I'm not totally sure, but I'm relatively sure that the Hubble telescope runs on 386s (massive floating point performance here ;), which are radiation-shielded as well.

    I was just wondering: are the shuttle and Hubble programmed in Ada? Just a thought. Yeech.
  • by Anonymous Coward
    I believe that companies should be able to make money.

    So what you're proposing is some kind of industry tax? Free products should not be allowed, because then the industry couldn't make any money?

    Bollocks. Nothing's more capitalistic than crushing your enemies by offering cheaper and/or better products. Taken to the extreme that means free quality products.

    What's your problem?

  • by ec_hack ( 247907 ) on Monday March 26, 2001 @06:26AM (#339958)
    The availability of an inexpensive, radiation tolerant CPU is a big win for space researchers. Right now there are darn few radiation-tolerant parts available for use in space applications due to decreased demand from the military. The International Space Station is using Intel 386s for embedded CPUs, as they are simple enough to be relatively rad-hard. More modern CPUs, such as in the laptops used on the ISS and Shuttle have about one lockup/day due to radiation.

    The design requirements for software-controlled systems in space are so stringent that to do anything sophisticated requires incredible redundancy, cross-checks among the systems, and increased design complexity, all of which significantly drives up design costs (and causes all kinds of debugging problems). "Tell me three times" is not enough: you have to tell three controllers three times, three different ways, and then they need to cross-check. This could be a big step forward for software geeks in space.
  • Just as proprietary software will continue to exist, closed hardware will almost certainly continue to exist as well.

    Why not just apply the appropriate business model to a company's needs? Open source hardware won't put anyone out of business, unless the open products turn out to be better than the closed ones...and isn't that what capitalism is all about? May the best product win?

  • I guess you also would have opposed internal combustion engines, since they drove steam engines out of business. Sure, companies should be able to make money in a capitalist system. But they do not have an inherent right to make a profit, just a right to try. When a company goes belly-up because of a faulty or outdated business method, no crime has been committed (usually...). This is just a case of competition outdoing the older company.

    //rdj
  • I saw a number of projects on the net where people try to build a pure hardware codec using FPGAs. It seems that the better approach is to build a CPU-based, hardware-assisted codec.

    Here is a great opportunity to free music from MP3 license payments. If somebody creates an open-source reference hardware/firmware implementation, Far-East companies will start making cheap portable players/recorders in no time.

    As for the development costs: many FPGA vendors provide their software for free or for a small price, because they make money on their chips. The only problem is a good Verilog/VHDL simulator. FPGAs themselves are pricey, but there are some one time programmable devices (Atmel, Quicklogic) that cost under 50USD.
  • by wiredog ( 43288 ) on Monday March 26, 2001 @06:32AM (#339962) Journal
    $20 billion? Hell, an aircraft carrier only costs about $3 billion. IIRC the newest Intel fab cost about $2 billion.
  • by BillyGoatThree ( 324006 ) on Monday March 26, 2001 @06:37AM (#339963)
    "Their in-house coders pore over each microchip and line of code hundreds of times looking for even the smallest bug. This superior attention to detail is not possible using the limited resources of the open-source method."

    Sure it is; here's how: use exactly the process NASA uses now, the one you are apparently comfortable with. Then ADD (not replace) more programmers by making the source available via FTP.

    Adding openness to an existing project loses nothing. Yes, shifting the burden of quality OFF of some process ONTO openness may not always be a good idea (not in one go, anyway). But adding more checks doesn't lower quality.
    --
  • "Leon the pig farmer" boards the shuttle? (-:
  • Hmm. I can just imagine... "Microns? I thought you said 'metres'!!!"

    Rich

  • Mir was not a very computerized station.
    Most of it ran on old reliable mechanics that don't care if you need to reboot your control system.
    I read an article about the Russian space program, and one thing I remember is that they got most of their stuff up there by brute force.
    Actually kind of funny.

    // yendor


    --
    It could be coffee.... or it could just be some warm brown liquid containing lots of caffeine.
  • Open Source doesn't mean it was developed by the masses. It just means it's licensed under an open-source license. It means that if others want the design, they can have it, and use it.

    You can bet that NASA uses the same quality controls on these circuits as with any other... they just release it to the public.
  • It is also why NASA can succeed in safe, reliable space flight time and time again, while other space programs are struggling.

    If NASA were so great at this, then what the hell happened to the poor people onboard the Space Shuttle Challenger?

    The Russians have amazed everyone with their ability to use cheap components and good-enough systems to dramatically lower the price of their launches, and not spend $600 on every nut and bolt.

    And another thing:
    Open source is great, and produces some of the most well written, secure code that is available, anywhere, but I'm not sure we should be trusting this technology to aide in piloting a space vehicle

    We're not talking about a few guys hacking out a CD player over a weekend. We're talking about a serious project conducted by serious scientists. It would appear you are being seriously naive.

  • I am a capitalist. I believe that companies should be able to make money.

    I am a capitalist. I believe that companies should provide high-quality items to the market.

    If they are good, the items will be bought...

    There is no natural/God given/whatever right for a company to make money.

    Companies that cannot produce something useful should disappear...

  • > How about the industry becomes unprofitable because those in the hardware business (making chips...not selling a computer) are generally not making their cash in the services.

    Nothing will prevent them from making chips. And, as a capitalist, you should think that it would even make for a more efficient market and ultimately be good for the consumer.

    > I am a capitalist. I believe that companies should be able to make money.

    Like Rambus, or like AMD? By locking up the IP, or by making a better/cheaper/more attractive product? If it is the former, then I understand why you would object to open-source hardware.

    > The Open Source model works so well for software because companies can sell services while making less money on the software.

    Well, 'Free Processor' would still cost money. Even if silicon wants to be free. :-)

    What the 'Open Source' model does is provide everyone with the bare minimum of software. It turned the OS, the text editor and the compiler (for instance) into commodities. Much more importantly, it relieved us from 'vendor locks'.

    What an 'Open Source' model on hardware would do is provide everyone with the bare minimum of hardware compatibility, and save people from vendor locks. I expect that you understand that having standard hardware (from ISA boards to USB devices) has been a good thing for the computer industry. Escaping the IBM vendor lock on the PC has been one of the best things that has happened to computer users.

    [crunch]

    > For example, Company X, Y and Z make these CPUs. Company X employs 4 people to work on the spec of this chip so they have their in-house employees. Company Y employs 2 people. And Company Z employs none. Company Z has less overhead of employees to work on the product and can sell the same thing for less. That puts Company X out of business. In order to compete, Company Y has to lay off its employees that work on the product.

    Making millions of highly clocked processors will still require a lot of skill, so I would not expect Intel to be put out of business any time soon. Furthermore, open hardware can peacefully(?) coexist with closed hardware. The existence of GNU/Linux did not prevent thousands of people from waiting in line at each release of Windows (or Mac OS X).

    Open hardware may even help closed companies. For instance, FreeBSD enables Apple to get back in the competition with Microsoft.

    > Where do we make our money?

    Your example may as well be: Company X employs 4 people to work on the spec of this chip so they can be the first to implement the next revisions with higher clocks. Company Y employs 2 people, but they work on making a different form factor for the processor, so it can be embedded in wrist-watches. Company Z employs none and goes out of business because it doesn't produce anything relevant.

    I don't know, it just makes sense to me. Maybe I am not a capitalist... :-)

    Cheers,

    --fred

  • Their in-house coders pore over each microchip and line of code hundreds of times looking for even the smallest bug.

    And in-house coders can't do this with Open Source or Free Software? I don't understand. Plus, a Dr./PhD CS professor could have his class pore over it and learn how it works, and WOW, they might even find a bug and help the in-house coders...

  • Well, we've got Open Source hardware in the news. I figure with all the theorizations, I'd toss in my two cents on what this will entail.

    THE BAD:
    1) Expect to see some of the cultic behavior that has affected Linux's reputation - people jumping on every note about LEON and related technology as the Ultimate Thing to Save Us from Microsoft. Expect this behavior to be noted by non-OS manufacturers and used against OS hardware.
    2) Expect serious reality cramps when people discover just how much fabricating chips costs. Expect conspiracy theories to emerge.
    3) This is a first step, and there's a lot further to go.

    THE GOOD:
    1) OS processors are at least feasible. Let's face it - this is just cool.
    2) The genie is out of the bottle - the idea is there. It will spread.
    3) Intel has been made a fool of by AMD. Transmeta (associated with Linus Torvalds fortunately) has their new chips. Now we've got this. People are starting to rethink chips, processors, etc.

    Do I think a revolution just started? No, though I expect some people will play it up as such. There may be a revolution, but it won't happen immediately.

    However, a good idea is out there and it's physically manifested. I expect good things to come of it - just not right away.
  • Typically it costs a few hundred dollars. They make money on chips, and sometimes even give software away for free. Even for ASICs, there are some low-cost alternatives, like Alliance ( http://www-asim.lip6.fr/alliance.html )
  • Now several electronics manufacturers can band together to produce chips. The chip-making club will still be small, just not as small or as expensive.

  • there's no reason not to trust open source developers; they're much better than the money-hungry programmers hired by corporations.

    So why are "money hungry" programmers any less driven than freebie bedroom programmers?

    They are probably more motivated on occasion (money, moolah, ca$h), and may actually have achieved a higher degree of academic excellence to get their high-paying jobs in the first place.

    In saying that, I have been guilty of knocking out the odd Friday-afternoon bit of code!

    (this is my first thing i wrote for slashdot!)

    Congratulations! With a user number like that you must have been lurking for a considerable period of time. Expect the Grammer Nazi to come after you for that spelling anytime soon ;-)

  • Oh, please.

    Often the open source developer and the "money-hungry programmer" are the same person. Where do you think we get the money to make a living? Here's a hint: we're not the voice in the McDonald's box.
  • "How many probes did they lose or mishandle in the last few years? "

    The answer is, the US has lost more probes than the rest of the world has even tried to launch.

    A better question is how many other countries have even launched a probe anywhere?

    Or, how many probes has nasa successfully landed as compared to all other countries on the globe? Of course, the answer is NASA has had more successful such missions than all other countries combined.

  • 3) Intel has been made a fool of by AMD. Transmeta (associated with Linus Torvalds fortunately) has their new chips.

    While I agree (in some regards) that AMD have been producing better desktop CPUs than Intel recently, they are virtually nowhere in the server market yet.

    How you can include Transmeta in the same sentence is laughable... their products are akin to those old useless Cyrix processors of years ago. They hyped their technology to great heights, and then embarrassed themselves.

  • Quite a lot. All your communications satellites didn't get up there on the Space Shuttle, they went up on Ariane or similar rockets. Why? Bcos the Space Shuttle is hugely expensive to run - deliberately so, since it was designed to look attractive to voters and to Air Force generals who are used to aircraft, not to provide the most efficient satellite delivery service.

    Grab.
  • Although I think that the Crusoe had a very poor showing in benchmarks, and dismal performance, it's a very new technology. That, and I don't think the code-morphing technology has really hit the right market yet... The biggest difference between Cyrix and Transmeta is that Transmeta has the potential of a new technology; Cyrix was just playing follow-the-leader.
  • Maybe you have the specs to make the apps fast, but how's that going to help? Check the article - max speed (theoretical max for the target FPGA, they've still not hit it yet) is 100MHz. Not that there's anything wrong with this - for mobile phones and other apps this is absolutely brilliant news. But the guys in here who say "Hey, I can run my PC on open-source hardware" are demonstrating that they don't know much about hardware.

    So what's the score? Well, it'll be great for embedded processing stuff. Mobile phones, car (and aeroplane) engine controllers, and, yes, space flight stuff - all these can benefit from it. In these applications, the complete reliability of the core, the implementation of a known-good maths processor (Sparc) and the easy access to documentation makes it great. So what's the downside? Simply that it's nowhere near as powerful as any PC less than 5 years old. If you want to use this to run your PC, you're back to 486-land. An FPGA is a general-purpose chip, which means that it can't by definition match the speed of a designed-for-purpose PC chip.

    Grab.
  • One lockup a day due to radiation? Damn, and there was me thinking my PC kept crashing bcos of the shite code it was running! :-)

    Grab.
  • Only if you get some very fast FPGAs. Right now I'm working with a Xilinx Virtex II. For a simple Time Of Arrival buffer the maximum estimated clock is 70 MHz. I believe the Virtex II hasn't even been completely finished yet, either.

    You'd be much better off using ASICs so far, but FPGAs are getting faster.

  • Open Source hardware running Free software -- wheee!

    "wheee" indeed... The thing is running at 60 MHz. LEON may be "open-source", but I'll stick to Intel processors that are 10 times faster and probably cheaper, thank you.

  • What is actually open-sourced is VHDL code. That is a language that describes a hardware design. It can be written at various levels; I assume that for a CPU they'd carry it down to the lower levels (registers and gates) - a tiny illustrative sketch of what that looks like appears at the end of this comment. But to actually get a chip, you've got to compile the VHDL for the particular chip-making process. This isn't as easy as compiling a C program; it takes considerable human intervention to get a physical fit and to correct for various process limitations. And then you've got to create the exposure masks for each step in the process (dozens of them), and finally make a batch of chips. It might take $0.5 million to get the first chip. But that's a few million less than if you created a CPU design from scratch, or licensed one from Intel.

    Notice that the implementations in silicon which have gotten as far as test are quite slow (down to 35 MHz), while the chips that are still in process are estimated at 90 to 150 MHz. But maybe a SPARC-compatible at 100 MHz running *nix will beat a P-III at 800 MHz burdened by Windoze? At any rate, it is one heck of an improvement on the underclocked 286's and 386's that are now the peak of space-rated technology. And since even the best quality checks seem to be subject to blind spots, getting outsiders to pick away at the design should improve the chances of finding bugs before they pop up in flight. (Blind spots: ESA: a new Ariane rocket blew up on its first launch because a velocity-related calculation overflowed in software carried over from an older, slower rocket. NASA: lost a Mars probe because they forgot to convert pounds to newtons. Intel: the Pentium divide bug.)
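
    As promised above, here is a minimal, hypothetical example of what a register-and-gates level VHDL description looks like. The entity name and ports are made up purely for illustration and have nothing to do with the actual LEON sources; a synthesis tool maps a description like this onto the flip-flops and gates of a particular FPGA or ASIC process.

      library ieee;
      use ieee.std_logic_1164.all;
      use ieee.numeric_std.all;

      -- A registered 8-bit adder: the synthesis tool infers an adder
      -- (gates) feeding a bank of flip-flops (the register).
      entity reg_adder is
        port (
          clk : in  std_logic;
          a   : in  unsigned(7 downto 0);
          b   : in  unsigned(7 downto 0);
          sum : out unsigned(7 downto 0)
        );
      end entity;

      architecture rtl of reg_adder is
      begin
        process (clk)
        begin
          if rising_edge(clk) then
            sum <= a + b;   -- result captured on each rising clock edge
          end if;
        end process;
      end architecture;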
  • Oh no! The US better get there quick before Mars becomes a Red planet! Doh! Too late!
  • So why are "money hungry" programmers any less driven than freebie bedroom programmers?

    Some reasons spring to mind:

    1. they get paid by the hour therefore recognition is not upon results alone
    2. they are wasting time on petty internal political battles instead of coding
    3. working towards M$ certification is more important than working towards a successful software project
    4. they don't program for the love of it... in fact they may have done an arts degree but did a quickie conversion because computers are where the money is


    They are probably more motivated on occasion (money, moolah, ca$h), and may actually have achieved a higher degree of academic excellence to get their high-paying jobs in the first place.

    Academic excellence != good programmer. Even most of a degree in Computer Science you will never use in industry. I did some really weird and academic modules that were useless. As for high paying jobs, I've found that pay tends to be inversely proportional to the challenge. My choices between jobs have been pretty well divided along the lines of 'interesting' or 'well paid'. I always pick the former as it pushes me more and builds up skills I can always trade in later if I get wife/house/kids/etc.

    Phillip.

  • "They should implement the processor in some sort of non-volatile programmable logic." That would be nice. And it has been done for the Z80 and maybe some other 8-bitters. You take too big of a performance penalty to do this with a 32-bit CPU, so far. That will change, but by then 64-bit CPU's will be standard in desktop computers... So for now, expect to see this implemented as ASIC's (semi-custom chips) and maybe full-custom.

    Also, the cost of compiling VHDL of this complexity to fit into any particular chip is probably in the low six figures. Sorry, but they are going to have to get it right, then be sparing with the upgrades.
  • LEON being a SPARC-compliant chip, will it then be able to do SMP? Isn't it Linux/SPARC that scales to 64 CPUs?

    Even if this chip sounds slow to me (25 MHz), with 64 of them it will start to take off.

    So the question is really _does_ this new processor do SMP?

  • The ESA is a gov't agency; they aren't concerned with competition. I think what they are really hoping for (besides getting enough eyes looking at the VHDL code to prevent anything like the Pentium divide bug from sneaking through) is that there will be sufficient commercial volume to drive the prices down for them. If they kept it closed source, it would simply remain a low-performance, high-cost chip built specially for space applications.
  • "It is also why NASA can succeed in safe, reliable space flight time and time again"

    I seem to remember a certain recent Mars mission... it was a software failure, they say, but in principle it's the same. Perhaps it could have been prevented if the software had been public.

    Seriously, why choose one of the two? Why not use millions of dollars AND let it be open source? After all, there's nothing to lose in software reliability by making the source public. The only issue would be keeping it out of competitors' hands, and there's little reason for that today.
  • Correction, my bad. That's $2 billion, not $20 billion, per the AC's link.
  • At least somebody thought of FPGAs instead of billion dollar plants (and the future ink jet ICs!).

    But you can get free software from Xilinx that will program the Spartan 2 series and the Virtex 300. I'll bet that the LEON could fit in a US$20 Spartan 2 150...
    What I'm getting at is that designing Open CPUs is a great idea. It allows developers to really get inside the hardware and optimize the hell out of applications, which is a good thing. However, the actual cost to make these CPUs is staggering, and unless a big company puts up some big bucks, I don't see it happening in the near future.

    Obviously LEON will only be used by people who need to make specialized, limited-run processors, like ESA. For everyone else, the cost of licensing a proprietary chip will always be offset by manufacturing economies of scale.

    In the Open Source/Free Software dichotomy ("We need a new model of collaboration" versus "Trying to control ideas is wrong!"), LEON is clearly on the Open Source side.

    It occurs to me that Open Source Hardware is not a very new idea. Consider how many Model A Fords are still on the road, 70 years after their brief production run ended. That's because all the parts can be made by simple, well-documented processes.

    __

  • The LEON model also exists in a fault-tolerant version, capable of detecting and correcting single-event upset (SEU) errors in any register or RAM element.

    I wonder... would that make the processor less susceptible to crashes from things other than a cosmic ray 'twiddling' a bit at random?

    Like in Windows for example, if the OS or a program screws up a register or does something else stupid, do you think this chip could correct that error by itself instead of crashing?

    Be amusing that.. Open source hardware to 'fix' bugs in Windows. We all know the ultimate 'service pack' for Windows is a Linux distro.

    --
    Delphis
  • I was speaking to a co-worker the other day about Sun's UltraSPARC III processor and he was telling me about CPU manufacturing in general.

    To actually create a fabrication facility to make CPUs, it takes about 20 billion dollars ($20,000,000,000). That's more than most companies can afford. Even Intel couldn't make very many new fabrication plants.

    Your coworker is prone to exaggeration.

    AMD recently built a very large, first class, state of the art facility in Dresden, Germany, for roughly 20-25% of the figure he cited.

    Assuming one didn't need a very large, first class, or state of the art facility, I would guess that a whole lot less could be spent to get a useful fab up and running. Perhaps even less than 10% of what AMD spent.

  • Fabs don't crank out chips in "48-72" hours, not even in days; Atmel's CS locations take anywhere from 6-18 _weeks_ to run a set of wafers through. I don't know about European fabs, but they can't be that much different.

    Puff, I work for Atmel, but I don't speak for them.
  • by Guppy06 ( 410832 ) on Monday March 26, 2001 @09:34AM (#340000)
    "A space flight is a dangerous and EXTREMELY complicated process, and if a single component fails, or produces an error, the results can be catastrophic. "

    I don't know the rules that the ESA has to live with, but if they're anything like the rules the DOD imposes on rocket launches, if something goes wrong, you just blow up the rocket. Even the shuttle's SRBs have the equivalent of a really long stick of dynamite to make sure that, in case of an accident, no pieces bigger than my hand or so would ever reach the surface.

    "NASA, in the USia (sic), spends literally millions of dollars designing the space shuttle's computing system from the ground up. "

    We're talking about the ESA here, not NASA. There are only two countries out there that have manned spaceflight programs, and PRC is much closer to being number 3 than any European nation or group. In my opinion, the Japanese will have manned space flight before the Europeans.

    Today's useless fact: After the US and Russia/CIS, the country with the most people that have gone into space is Canada.

    If you're going to compare the uses of these chips to an American launch platform, I'd use the Titan III, or maybe the Delta 2, but definitely not the STS.

    "Their in-house coders pore over each microchip and line of code hundreds of times looking for even the smallest bug. This superior attention to detail is not possible using the limited resources of the open-source method."

    Tell that to the NSA. So far, it seems they're doing pretty well at scrutinizing open source operating systems (i.e. GNU/Linux), and seem to be on the verge of making it the most secure modern operating system, hands down.

    Open Source means that the hacker sitting in his mom's basement with a computer and a bag of Fritos can (legally) take apart and tinker with the innards. It doesn't REQUIRE you to fit that stereotype, though. If it did, then what does that say about IBM's efforts to run Open Source software on their mainframes?

    "It is also why NASA can succeed in safe, reliable space flight time and time again, while other space programs are struggling."

    I think you're confusing "attention to detail" with "multi-billion dollar infrastructure set up in the paranoid 50's and 60's to support the Saturn V." As long as they had the money to pay for the rockets, you could give Brevard County, Florida to any country in the world and they'd be able to build and launch super-heavy lifters to their heart's content.

    There's so much there that even Florida's Spaceport Authority (part of the Florida Department of Transportation) owns and maintains their own launch facilities on the Cape (Launch Complexes 20 and 46, I believe). http://www.spaceportflorida.com [spaceportflorida.com]

    Getting into space doesn't require millions of dollars. Currently, all it requires is hundreds of dollars in model rocketry parts (and permission from the DOD and FAA to launch it). Sure, getting into orbit is even trickier, but I know of a team at Embry-Riddle Aeronautical University in Daytona Beach working to do just that with a 6' rocket. The only expensive part of the equation is trying to put something the size of a Mack truck into orbit.

    ... and that concludes my rant.

  • Satellites (communications, spy, whatever) are, by definition, not probes. By definition, satellites have a closed orbit; probes do not. Hubble is a satellite, Pioneer is a probe. NASA has most certainly launched more probes than any other country; excluding possibly Russia, more than all other countries combined. Yes, other countries have launched numerous satellites, but compared to a probe, a satellite is trivial. You try even hitting Mars, let alone landing on it gently. Yes, Mars is a nice big barn door, but that barn door is on the other side of the country. And remember, in space, nothing moves in a straight line.

    Bill - aka taniwha
    --

  • I also want to add Hubble to the counterexamples. First they designed and manufactured everything according to the normal processes, and that resulted in an incorrectly shaped mirror. To resolve this, they published the problem and asked if anyone could come up with a solution. They then went through all the proposals and found one which worked. This was a very open-source type of approach, which worked where a meticulous check by the "in-house coders" failed.

    For those interested, it was the instrument used to test the mirror which was flawed, so the mirror was polished correctly according to the specifications of the instrument. So one could say that the testing procedure was flawed, and not the quality of the testing itself. I guess this is the difference: when you do something "in-house", it's difficult to get fundamental criticism, i.e. out-of-the-box criticism. In open source contexts, you almost always have someone pointing out fundamental flaws, because they are not afraid of criticizing "the boss".

  • Try free or $55. See www.fpgacpu.org/index.html#010311

    See also my Circuit Cellar 3-part article series, Building a RISC System in an FPGA, at www.fpgacpu.org/papers/xsoc-series-drafts.pdf, and see my DesignCon 2001 paper, Designing a Simple FPGA-Optimized RISC CPU and System-on-a-Chip, at www.fpgacpu.org/papers/soc-gr0040-paper.pdf.

    Jan Gray, Gray Research LLC
    FPGA CPU News: www.fpgacpu.org
  • On the other hand, NASA can't tell the difference between feet and meters (they've botched that one up more than a few times), relies on antique systems because hardware manufacturers can't divide, and hasn't the money to build a next generation of vehicles (X-33, X-34 and the blended-wing aircraft, to name but 3 axed projects).

    The ESA is not much better, having blown up at least one Ariane rocket because they'd swapped a + for a -. But at least, that was unmanned and merely cost a lot of people a lot of money, rather than their lives.

    Open Sourcing means more eyeballs and more checking. Checking that space agencies (running on VERY VERY tight deadlines) can't afford to do. Sure, the patches will need to be examined ruthlessly, but with exhaustive testing possible via simulations on every PC and its cousin, deliberate or accidental bugs can be all but wiped out.

    Don't trust an agency because it's big. Trust it because it's done something to merit it. The NSA, for example, has probably done more for its image in the past few months, via SE Linux, than it has achieved in the rest of its existence. The ESA's move could, likewise, turn this barely-known, insignificant launcher of commercial satellites and the odd scientific one into a major space research organization. COULD. Not will. There's a long way to go yet, but getting known is a good step in the right direction. Getting known and aided by Nerdius Technomaximus is even better.

    P.S. Anyone want to bet on when Linux'll get a self-modifying architecture?

  • If I recall, the US space shuttle runs on something like a dozen underclocked i286's, each processor with something like 5-way redundancy.

    Your recollection is wrong.

    The shuttle uses five IBM AP-101S computers. They are not microprocessors. Architecturally, they belong to the IBM 360/370 family of computers. See http://www.fas.org/spp/civil/sts/newsref/sts-av.html [fas.org].

  • The shuttle's GPCs are programmed in HAL/S (high-level aerospace language/shuttle).
  • I don't know the rules that the ESA has to live with, but if they're anything like the rules the DOD imposes on rocket launches, if something goes wrong, you just blow up the rocket. Even the shuttle's SRBs have the equivalent of a really long stick of dynamite to make sure that, in case of an accident, no pieces bigger than my hand or so would ever reach the surface.

    The purpose of a range safety system is not to "blow up the rocket", it is to terminate thrust, allowing the rocket, or pieces thereof, to follow a ballistic trajectory into a safe impact zone. That is what the linear shaped charges do to the SRB. They "unzip" the motor casing, causing the internal pressure and thrust to drop to near zero.

    NASA, in the USia, spends literally millions of dollars designing the space shuttle's computing system from the ground up. Their in-house coders pore over each microchip and line of code hundreds of times looking for even the smallest bug. This superior attention to detail is not possible using the limited resources of the open-source method.

    But in the last few years, especially with the emphasis on Cheaper/Faster/Little/Yellow/Different/Better, NASA, as well as most other government-sponsored research projects, has put more emphasis on COTS, or Commercial Off-The-Shelf components.

    It's far easier to purchase an op-amp from a company than design+fab your own. That's why so many components have two ratings - commercial and military. The military ratings are designed to meet stricter environmental and performance conditions.

    This is probably true in the realm of processors too. The only processor I know of specifically that NASA has used is the 8085 in the Mars Pathfinder. I think this was chosen mostly for its low power consumption. Of course it's ancient now, but when the program began this was a decent well-known reliable processor.

  • Alright! Now as soon as it is proven to be reliable, building the computers for a starship might not be so hard. ;)
  • We're talking about the ESA here, not NASA. There are only two countries out there that have manned spaceflight programs, and PRC is much closer to being number 3 than any European nation or group. In my opinion, the Japanese will have manned space flight before the Europeans.

    A large reason for that being that Europe totally lacks suitable launch grounds (you need lots of empty ground east of your launch spot to let the various stages of the rockets fall down). If Europe tried launching rockets, the parts would fall down somewhere near Moscow. I don't think the Russians would be too happy with that.

    Europe certainly has the economic and technical means for manned spaceflight. It's a political problem, like much else...

    -henrik

  • Seems pretty open source to me, they even have a shot [gaisler.com] of the sim running on KDE.... pity I use GNOME
  • by jd ( 1658 )
    Those Intel processors would curl up and die in space, in high-radiation areas (such as Three Mile Island or Chernobyl), or indeed anywhere with exotic conditions.

    If I was in some extremely hazardous environment, and relied on a computer to keep me intact, I'd go for one that'll do the job, whatever speed it ran at.

    If you'd prefer the flat-out MIPS rating, go for it. Your choice. Just don't go into space, the Irish Sea, anywhere that's glowing, or within a few hundred miles of any EMP stuff. You'll be just fine. Bored witless, but fine.

  • OpenBSD is done by a group of volunteers (for the most part), and the quality of the auditing isn't anywhere near high enough that you should put lives on the line.

    Anyone who reads the OpenBSD Errata [openbsd.org] can tell you that while they do a good job for an open source project, I certainly would not want any lives riding on the security of it.

  • I am a capitalist.

    Not from what I infer, based on your views. "From each according to his abilities, to each according to his needs," sounds like something other than Capitalism.
  • ...the compiler is gnu and the money isn't there since the spec is open anyway.

    Not so. There's plenty of money in porting gcc, the linux kernel, &c. to different architectures. Cygnus Solutions does just that and makes money at it; my company, MontaVista Software, has been responsible for getting Linux to run on several boards -- and free software or no, there are companies who are more than happy to pay us to do it.

    Presumably this works for hardware too.
    A large reason for that being that Europe totally lacks suitable launch grounds (you need lots of empty ground east of your launch spot to let the various stages of the rockets fall down)

    Actually Europe has better launch grounds than NASA (not geographically as in located in Europe, but as in being owned by the ESA). It's in French Guiana (South America), opens over the ocean, and is only slightly north of the equator, unlike NASA's launch grounds.

    If they want to shoot something into an equatorial orbit (e.g. commercial satellites to geostationary), they can save a lot of fuel by exploiting Earth's rotation speed, which is highest at the equator. And the Ariane rockets aren't exactly unsuccessful, commercially or otherwise... (in fact, the ISS will get supplies via Ariane 5 rockets, IIRC).
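
    A rough back-of-the-envelope check of that claim (my own numbers, not from the comment): the Earth's equatorial circumference is about 40,075 km and a sidereal day lasts about 86,164 s, so the surface moves eastward at roughly 40,075,000 m / 86,164 s, or about 465 m/s, at the equator. A launch site at latitude phi only gets 465 * cos(phi) of that for free: about 409 m/s at Cape Canaveral (roughly 28.5 degrees N) versus about 463 m/s at Kourou (roughly 5.2 degrees N). That is on the order of 50 m/s of extra free velocity for an eastward launch, before even counting the smaller plane change needed to reach a geostationary (equatorial) orbit.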

  • Reading the EE Times article, I saw that they got the processor running at like 18 MHz or something.

    How fast do you people think this processor could run... like in the future, once it gets developed further? How would it measure up to, say, a Pentium III or a SPARC or whatever?

    Open Source processors sound like a pretty cool idea and I hope they get popular, but for now (or maybe 5 years from now), it seems like they'd fall way behind (speed-wise) traditional ones... but that's speaking from a consumer angle. My understanding is that this would be a universal platform for the research and [aero]space community... where shared development, and not necessarily processor speed, is the main issue.

    -Christian

  • The ESA is not much better, having blown up at least one Ariane rocket because they'd swapped a + for a -.

    If you refer to the Ariane 5 disaster (at least I don't know of any other that got destroyed due to a software problem), then the effects which required it to be destroyed were a bit more complicated, though the reason was relatively simple. They reused the proven Ariane 4 software, but missed a function that was not able to handle the wider input range produced by the more powerful rocket. It subsequently reported errors that could not be handled, or just gave wrong results (I don't remember which), causing the rocket to stray so far off course that it had to be destroyed.

    At least this happened not in regular operation but on the maiden flight of the system (still with a payload and expensive, of course); it got fixed, and they have since done a bunch of flawless Ariane 5 launches.

  • There are only two countries out there that have manned spaceflight programs

    To do some nitpicking: a few more countries have manned spaceflight programs, but only the USA and Russia have the vehicles to launch people into space. The others buy space flights for their astronauts from these two.

  • So why don't they just look over this microchip before using it? The fact that it's Open Source should only make the process easier.
  • And I can make a hat, a pterodactyl...
  • Please understand that LEON is developed to solve the specific problems of producing processors for space applications - it is NOT targeted at any commercial product. ESA has provided a simplified version of the model under the LGPL to increase the user base and hopefully find any remaining bugs.

    I agree that space hardware is rigorously tested before being used, but after having designed space computers for more than 15 years, I have not yet seen a processor that does not have a list of (known) bugs. The bugs are usually handled by software patches, which is perfectly normal also for ground use. Just take a look at the errata list of current Intel processors!

    Also, LEON does not in any way compare to modern processors such as PIII, K7 or PPC. LEON targets embedded applications and SOC systems where 20 MIPS usually is enough. Our current (0.35 um) demonstrator provides 50 MIPS, and the first product will reach 100 MIPS. This is enough for us but not very useful for workstation applications.

    Oh yeah, and the slashdotting of my website has just blown my web-hosting bandwidth limit for this month (1.5 Gbyte) ... :-)

    Jiri Gaisler
    Gaisler Research
  • Europeans don't lack a launch spot; they've got Kourou. This site is in South America, on French territory (French Guiana), and it is a good spot, as it is on the equator (less energy needed than from the Florida launch site). You make me laugh with your rockets falling down on Moscow. Are you kidding? Second point: Ariane 4 is the best rocket these days to launch whatever project into space; a Titan rocket is less powerful than an Ariane 4, as a comparison of usable payload weights shows. Don't, don't believe that Europeans lack anything in this area, except a shuttle.
