'How I Compiled My Own SPARC CPU In a Cheap FPGA Board' (www.thanassis.space)

Long-time Slashdot reader ttsiod works for the European Space Agency as an embedded software engineer. He writes: After reading an interesting article from an NVIDIA engineer about how he used a dirt-cheap field-programmable gate array board to code a real-time ray-tracer, I got my hands on the same board -- and "compiled" a dual-core SPARC-compatible CPU inside it... Basically, the same kind of design we fly in the European Space Agency's satellites.

I decided to document the process, since there's not much material of that kind available. I hope it will be an interesting read for my fellow Slashdotters -- showcasing the trials and tribulations faced by those who prefer the Open-Source ways of doing things... Just read it and you'll see what I mean.

This is the same Slashdot reader who in 2016 reverse engineered his Android tablet so he could run a Debian chroot inside it. "Please remember that I am a software developer, not a HW one," his new essay warns.

"I simply enjoy fooling around with technology like this."
  • by paralumina01 ( 6276944 ) on Sunday October 20, 2019 @01:43PM (#59328352)
    It's nice to see people still find joy outside the conventional.
    • Agreed, this is really cool stuff!
    • by ttsiod ( 881575 )

      Thanks for sharing! It's nice to see people still find joy outside the conventional.

      My pleasure! It's nice to see so many people reacting positively to the post.

      • Thanks for sharing this. I think this also demonstrates an interesting thing:

        The SPARC CPU is the hardware used in ESA space missions, so it's not a toy.

        You *compiled* this hardware. From source code.

        This must leave the people demanding that "software patents" be banned very, very confused.

        • This must leave the people demanding that "software patents" be banned very, very confused.

          Care to explain?

          • by raymorris ( 2726007 ) on Sunday October 20, 2019 @08:18PM (#59329238) Journal

            Would you agree that Intel has come up with a few inventions since the 1980s, that the Core i7 is different from the 8086 CPU?

            The people who believe in "software patents" say anything that can be expressed as computer code can't be an invention, and thus shouldn't be eligible for a patent. Of course, anything that can be made with 3D printing has to first be coded as STL - computer code. Any invention that can be 3D printed can exist as software, and therefore, according to the "software patent" myth, can't be an invention at all. Or at least, it can't be deserving of a patent, since it can exist as 3D printer STL code.

            In this story, it was mentioned he *compiled* a cpu - from source code. The same way you would compile Firefox. He used compiler tools to render a silicon representation of whatever inventions existed in the source code.

            Because CPUs are burned from source code, whatever clever new ideas are in the silicon must first be in the source code for the cpu. Whatever inventions exist in the Core i7 you buy at the store, those inventions were first in the i7's source code.

            If there are any inventions in new types of CPUs, they are "software inventions" - they existed in software form before they ever existed in any other form. Either there have been no new ideas ever invented for CPUs, or the concept of "software patents" is bogus - there is no such thing as a software patent.

             

            • There is a third option.
              Maybe the "source code" is actually the specs for making a physical object. You know, like a CPU.

              Because CPUs are burned from source code...

              They're not really though. They're made from silicon and whatnot.

              • >> Because CPUs are burned from source code...

                > They're not really though. They're made from silicon and whatnot.

                Read the article you're commenting on, or maybe Google "Verilog". There aren't people with tiny X-Acto knives cutting 14nm lines into silicon chips to make caches. They are made essentially the same way as a CD - burned, after being compiled from source code.

                • PS: guess how Linux for amd64 came out before any physical chips did? By running Linux on the x64 design, prior to rendering x64 as "hardware". It was a fully functional cpu, in use by developers, before there was any x64 silicon.

                • They are made essentially the same way as a CD - burned, after being compiled from source code

                  They are not compiled, they are synthesized. Verilog is not executable code, it's a description language. When you write 'if' in Verilog, you mean: "put a mux here".

                  You could write a document describing the design of a typewriter, but someone manufacturing that typewriter according to those designs would not be "compiling the source code".
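
                  A minimal sketch of that "put a mux here" point (module and signal names are made up for illustration): the behavioral "if" below doesn't branch at run time; a synthesizer infers a 2:1 multiplexer, shown as an explicit netlist in the second module.

                  // Behavioral form: reads like software, but describes a circuit.
                  module pick(input wire sel, input wire [7:0] a, input wire [7:0] b,
                              output reg [7:0] y);
                      always @* begin
                          if (sel)    // not a branch instruction...
                              y = a;
                          else        // ...just a request for a mux
                              y = b;
                      end
                  endmodule

                  // What it synthesizes to: one 8-bit-wide 2:1 mux, no control flow.
                  module pick_netlist(input wire sel, input wire [7:0] a, input wire [7:0] b,
                                      output wire [7:0] y);
                      assign y = sel ? a : b;
                  endmodule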

                  • Verilog is very much executable code; that's the whole point of Verilog: that you can execute it as an interpreted program that simulates a digital circuit (and analog, too, if you're willing to put in a lot of work), and that you can also use it as part of the specification of a hardware digital circuit.
                    • Not quite. Verilog describes hardware. You could certainly simulate that described hardware on a computer during the design, but you're not executing the Verilog.

                      You can also simulate the aerodynamic properties of a car. That doesn't make the design documents of the car "executable code".

                      (some parts of Verilog are executable, but those are not synthesizable, and only used in test benches)
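
                      For contrast, a sketch of that executable-but-not-synthesizable side: a test bench driving the toy "pick" mux from the sketch a few comments up (names are illustrative; one possible way to run it is: iverilog -o sim pick.v tb.v && vvp sim).

                      `timescale 1ns/1ns
                      // Test-bench Verilog: runs in a simulator, has no hardware equivalent.
                      module tb;
                          reg        sel;
                          reg  [7:0] a, b;
                          wire [7:0] y;

                          pick dut(.sel(sel), .a(a), .b(b), .y(y));  // the synthesizable part

                          initial begin            // "initial" blocks are simulation-only
                              a = 8'd10; b = 8'd20;
                              sel = 0; #10;        // "#" delays mean nothing in hardware
                              $display("sel=%b y=%d", sel, y);
                              sel = 1; #10;
                              $display("sel=%b y=%d", sel, y);
                              $finish;             // system tasks exist only in the simulator
                          end
                      endmodule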

                  • Putting aside the fact that interpreted programming languages and architecture-independent languages ARE programming languages* (including Verilog), try making a blueprint to build a typewriter that doesn't include the clever new ideas (inventions) that are part of the typewriter!

                    So even supposing Verilog (and C, btw, which can be compiled into silicon) couldn't be used for executable programs, and were "just the spec for how to build it", they would STILL need to contain the clever idea, the invention.

                    Btw,

                    • Verilog ... oh you think it's not a programming language because it uses a virtual machine to run, like most modern languages do.

                      No. Verilog doesn't "run" at all. It's a hardware description in a format that, to some degree, resembles a programming language. You could take a procedural description in Verilog, and without too much effort (less than 1% of the total synthesis effort), turn it into a schematic diagram or a text file containing a netlist. And nobody would consider a netlist a "program" or "software".

                      The original Verilog design is just like a netlist, but in a more compact format. Instead of creating a bunch of flip flops
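
                      A toy example of that compactness (names made up): the one-line procedural register below says the same thing as the explicit netlist of four flip-flop instances under it.

                      // Compact procedural form: one line describes a 4-bit register.
                      module reg4(input wire clk, input wire [3:0] d, output reg [3:0] q);
                          always @(posedge clk) q <= d;
                      endmodule

                      // The netlist view of the same thing: four explicit flip-flops.
                      module reg4_netlist(input wire clk, input wire [3:0] d,
                                          output wire [3:0] q);
                          dff u0(.clk(clk), .d(d[0]), .q(q[0]));
                          dff u1(.clk(clk), .d(d[1]), .q(q[1]));
                          dff u2(.clk(clk), .d(d[2]), .q(q[2]));
                          dff u3(.clk(clk), .d(d[3]), .q(q[3]));
                      endmodule

                      // Toy D flip-flop cell, standing in for a vendor library primitive.
                      module dff(input wire clk, input wire d, output reg q);
                          always @(posedge clk) q <= d;
                      endmodule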

                    • by AmiMoJo ( 196126 )

                      This is a lot like the "illegal numbers" argument. Any file is just a long binary number, and certain files are illegal, therefore certain numbers are illegal.

                      The law tends to pay little attention to such arguments and instead focuses on the common way that the file is used, e.g. a JPEG displays an image on the screen.

                    • Furthermore, if something is generated through some more complex means, the result of a hardware synthesis process is not the process itself, just as a sorted array of integers isn't a sorting program. So this confusion of hardware design methods with the end results seems unfortunate on that level as well.
                    • > No. Verilog doesn't "run" at all.

                      iverilog. :)
                      Also, funny thing: when you press the "run verilog" button on edaplayground.com, it does just that.

                    • Yes, the "illegal numbers" argument, saying that a jpeg is just a number and nothing else is silly. It's silly because it pretends that data doesn't have meaning, that there isn't a picture there. They would pretend, contrary to common sense, that a virus isn't a dangerous thing; "it's just bytes", they'd say, ignoring the fact that those bytes make up some THING.

                      Similarly, the "software patent" nonsense pretends that the bytes of software are just random bytes, that they don't comprise a meaningful thing

                    • Neither an invention nor a process.
                      You're effectively claiming that a patent application can be patented.
                    • Because the useful new idea, the invention, could be in the patent? Is that what you're suggesting?
                      If I'm right that a patent covers a novel (new), useful idea, not one particular instance of a machine, then that patentable idea could be in the patent? Is that your argument?

                      It's a legal requirement that the invention be in the patent. So yeah, it always is. One particular elevator in building C isn't the invention; the idea, the unique new design for elevators, is the invention.

            • anything that can be expressed as computer code can't be an invention, and thus shouldn't be eligible for a patent. Of course anything that can be made with 3D printing has to first be coded as RTL - computer code

              One would think that on /., people would not haphazardly abuse the word "code" to refer to what is clearly data, not algorithms.

              Because CPUs are burned from source code, whatever clever new ideas are in the silicon must first be in the source code for the cpu.

              That is demonstrably false. For example, new materials for transistor structures, or the structures themselves, such as FinFET transistors, are obviously not in "the source code for the CPU", since they're lithographic and material science advances. In addition to that, some new ideas may be in the tools for automated generation of layouts and not in the high-level specifications t

              • Your sig made me laugh. I had to look up the verse - not what I was expecting. That reminds me of when our youth pastor, who was also the lead pastor's son, was supposed to read from Proverbs 5 in front of the congregation. He skipped over 5:18 and mumbled 5:19. :)

                "Tools for automated generation of layouts" etc aren't in code for the cpu - and aren't in the silicon rendering of it either. They aren't part of the cpu, period.

                The difference between Broadwell and Skylake CPUs isn't the fets. The useful new

                • But at the same time, the differences between Broadwell and Skylake are practically indistinguishable by any real-world metric.
                • Also, you seem to be hung up a little bit on the fact that the CPU has some kind of description. I'm saying that turning the description into an actual working circuit is itself a field ripe for improvement, said improvement manifest in the final results but obviously not in the original high-level model. You're saying that "they aren't in the code for the cpu" - and therefore aren't "part of the CPU"? Maybe from your POV, but definitely not from mine. The CPUs would definitely show any such improvements
        • He didn't "compile hardware", he compiled a bitstream program for a programmable logic circuit. I'm fairly certain that ESA doesn't actually run their CPUs as simulations on programmable logic circuitry.
          • He did not SIMULATE the cpu, he CREATED it. It's not an emulation of a cpu, it is a cpu.

            Specifically, he created it the same way Michelangelo created the statue of David - by getting rid of the unnecessary parts of the raw material. Just as Michelangelo created a drawing specifying what the statue should look like, then used tools to render that design in stone, ttsiod created code specifying what the cpu should be like, then used tools to render that design in transistors.

            His cpu is real just as the statue is real stone, or any 3D-printed part is a real part.

            • He did not SIMULATE the cpu, he CREATED it.

              Is he really the author of LEON?

              Specifically, he created it the same way Michelangelo created the statue of David - by getting rid of the unnecessary parts of the raw material. Just as Michelangelo created a drawing specifying what the statue should look like, then used tools to render that design in stone, ttsiod created code specifying what the cpu should be like, then used tools to render that design in transistors.

              He didn't use lithography, actually. So he didn't "get rid of the unnecessary parts of the raw material". It's more like loading a photo into a digital photo frame.

              His cpu is real just as the statue is real stone, or any 3D-printed part is a real part.

              It's more like real in the sense in which the text of a novel is real.

              Maybe they are unaware that any invention that can be 3D printed must first be expressed as code - as software.

              This is trivially a false claim, since all you need is a description of a set. Not all descriptions of sets need to be algorithmically compressed.

              Therefore there can never be any patentable inventions in new generations of CPUs?

              That doesn't make any sense. There's a tremendous amount of patentable inventions in new generations o

            • by gtall ( 79522 )

              It depends on your definition of a CPU. If a CPU to you is merely its instruction set, then yes, you can compile an interpreter for that instruction set. However, CPUs are much more than that. If you don't know the difference between an ASIC design and an application for an FPGA, then you don't understand hardware.

                • > If you don't know the difference between an ASIC design and an application for an FPGA, then you don't understand hardware.

                Let's talk about the difference. Which one do you think isn't real? Because you run instructions with either. As I mentioned before, Linux for AMD64 CPUs was running before there was any kind of bespoke AMD64 die. So any inventions in amd64 were already in operation before there were any dies.

            • Maybe they are unaware that any invention that can be 3D printed must first be expressed as code - as software. So they are actually saying that anything that can be 3D printed isn't an invention.
              Data, not software.

              • Every computer you've ever seen in real life is a von Neumann machine. A von Neumann machine is a machine that takes as input data bytes telling it what to do. Just like STL is data bytes telling it what to do. That's why buffer overflow is a thing - because instructions are data.

                • Nope?

                  While data can be instructions, and malformed data can produce buffer overflows, that does not make it code. Otherwise every paper drawing would be code, too.

                  • Nope? You figure you're just going to "nope" the von Neumann architecture, and therefore every PC ever produced, out of existence?

                    Come on, you're smarter than that. Smart, just REALLY, really bad at saying "hmm, I hadn't thought of it that way", and thereby learning something.

                    • I think he "noped" your claim that "that's why buffer overflow is a thing - because instructions are data". No, it's not, because even a Harvard machine program can have this kind of vulnerability if it allows its data to be rewritten with other data.
                    • True, in a Harvard architecture the bytes which are the operating instructions can't be overwritten by the bytes that get operated on. As we know, the *primary* danger of a buffer overflow is that the operands can overwrite the operators.

                      If one wants to point to the Harvard architecture, it makes my point even more clearly. The Harvard architecture distinguishes between instructions - bytes (originally on punched cards) which tell the computer what to do - and bytes that the computer does something to. STL tells the
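
                      For what it's worth, a toy Verilog sketch of that split (all names made up): the instruction memory below has no write port, so no store can ever rewrite what executes.

                      // Harvard-style toy: instruction and data memories are separate.
                      module harvard_toy(input wire clk, input wire [7:0] pc,
                                         input wire [7:0] addr, input wire [7:0] wdata,
                                         input wire we, output reg [15:0] instr,
                                         output reg [7:0] rdata);
                          reg [15:0] imem [0:255];  // instructions: read-only at run time
                          reg [7:0]  dmem [0:255];  // data: read/write

                          always @(posedge clk) begin
                              instr <= imem[pc];            // fetch
                              rdata <= dmem[addr];          // load
                              if (we) dmem[addr] <= wdata;  // store - can only touch dmem
                          end
                      endmodule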

                    • No, it does not.
                      Harvard architecture simply means you have two caches, one for data and one for code, and possibly distinct busses for code and data. The problem with buffer overflows and other things is memory, not processor architecture.
                      And: a Harvard architecture processor is still a von Neumann machine. (Facepalm)

  • Murphy's law (Score:4, Interesting)

    by maxrate ( 886773 ) on Sunday October 20, 2019 @02:08PM (#59328406)
    I had 20 PanoLogic units, brand new, collecting dust in my office for the past 5 years. I -just- recycled them 2 weeks ago thinking there was nothing to be done with them. That said, the Pano units were really nice. I throw out so much amazingly good stuff routinely. I could sell on eBay, but who has time for all that.
    • I -just- recycled them 2 weeks ago thinking there was nothing to be done with them. That said, the Pano units were really nice. I throw out so much amazingly good stuff routinely. I could sell on eBay, but who has time for all that.

      Too bad you don't have any friends, you could have given them to someone else who would have appreciated them.

      • by maxrate ( 886773 )
        I didn't think they were useful for anything. I have a lot of friends, but few geek friends. I would have been delighted to have donated them to anyone who would have liked one. You might make more friends yourself if your comments were a little more polite (saying I have no friends).
        • I've tried being nice, and I've tried saying what I mean, and I get more respect with the latter — especially around here. I speak differently to different audiences, and this one doesn't respect polite as much as it respects clever. Welcome to Slashdot!

          • by maxrate ( 886773 )
            Your rationale doesn't excuse your unsolicited & unwarranted negative comments directed towards me. Slashdot isn't the problem - people like you propagating pessimism and abysmal attitudes on Slashdot are the problem. I feel sorry for you. I hope things in your life improve and that you and your family are well. So long.
            • Slashdot isn't the problem - people like you propagating pessimism and abysmal attitudes on Slashdot are the problem.

              Pessimism? Really? That's the problem?

              You being too lazy to put stuff that others could benefit from on eBay or CL or freecycle is a problem. It's wasteful. Our biosphere cannot sustain your wastefulness.

          • With practice, it's possible to do both 'what I mean' and 'nice' at the same time. True, kind, necessary.

            Have you considered that chasing the respect of Slashdot is actually harming your ability to effectively achieve the necessary part?

        • If it comes up again check with local high school and college programs. I'm sure someone would love to get their hands on that sort of stuff for educational projects.

          I run a robotics team at a local elementary school and we are always strapped for time/money/materials.

          I've been intending to start a separate group to work on analog robots, but time and a supply of appropriate parts are limiting factors. A box of FPGA chips would be lost on me, but I know of at least a couple of high school programs that could

          • by maxrate ( 886773 )
            Whereabouts are you located? I always have 'stuff'. I figured sending the items for e-recycling was a good start, but I'd rather see folks learn and benefit from anything I have in excess. Lesson learned for next time! I appreciate your suggestion and will be cognizant of this going forward.
            • I am in South Florida. Our county has a STEM coordinator that handles all of the programs at the various schools in Broward County.

              If you have no affiliation, that person or the equivalent in your school district would probably be the best place to start. Or the local community college. Our community college system is tied in to the public schools with crossover courses - our high schools have a program where seniors can graduate with their high school diploma and their associate's degree at the same time. I'

    • Yeah, who has 10 whole minutes to create an eBay listing?

  • MiSTer FPGA (Score:4, Informative)

    by Luthair ( 847766 ) on Sunday October 20, 2019 @02:14PM (#59328418)
    There is actually an open source project focused on using an FPGA to emulate console & arcade hardware - https://github.com/MiSTer-deve... [github.com]
  • Their toolchain is a fucking joke. It's a monumental disaster. I have no idea who is maintaining it (and I use that word hesitantly), but I'm fairly certain they are indistinguishable from an infinite number of mildly retarded monkeys.

    Xilinx: you guys suck ass and I'm glad I'm done with the FPGA industry in my current career.

    • Could be worse, at least it's not Node.
      • by nyet ( 19118 )

        >Could be worse, at least it's not Node.

        Truth. Node maintainers are indistinguishable from an infinite number of severely retarded monkeys.

    • That's sad. Years ago, I did several CPLD designs with ISE Design Suite and Webpack and found it pretty straightforward and painless.
    • by gtwrek ( 208688 ) on Sunday October 20, 2019 @03:44PM (#59328662)

      I work closely with many of the Xilinx developers. They've got some really bright folks working on some exciting things. Their back end tools are quite good these days.

      The problem is they've (long) suffered from extreme tunnel-vision. Their envisioned way of solving things is the only way of solving things. Any variance on "Their way" is just a customer being troublesome.

      And the Xilinx way of doing front-end design is still solidly wedged in the 80s - one designer running a schematic-capture-like tool, then pushing GUI buttons through the implementation steps... Sigh.

      Xilinx management should also apply the ban-hammer to any new file types, and rely on industry standards more. MHS, HDF, XCIX, DCP, BD.TCL, NCD, NGD... The list of Xilinx filetypes could fill multiple Answer Records.

      • Sounds like the HURD team. If it ain't invented here, it ain't real...
        • by nyet ( 19118 )

          At least the Hurd team understands the importance of SCM and deterministic tools that don't require a UI.

          • by gtwrek ( 208688 )

            At least the Hurd team understands the importance of SCM and deterministic tools that don't require a UI.

            This. SCM for Xilinx was an afterthought in their newest tools (Vivado). They've tried about 5 iterations of recommended methodologies to back-fit SCM onto the current (UI) based flows. None of them has worked; each breaks even the most basic SCM fundamentals: human-editable source files, branching, merging, etc...

            • by nyet ( 19118 )

              It's astounding to me because this was already the case *15 years ago*. Why on earth is anybody saying Xilinx has "really bright folks" working for them, and why does anybody take them seriously?

              What a fucking joke. Sorry Xilinx, you guys are fucking retards.

      • by nyet ( 19118 )

        I've never been impressed with any of the tools they've tried to port to Unix. If they have smart software devs, they definitely don't understand anything but Windows.

      • by AmiMoJo ( 196126 )

        As someone who just read a Verilog book and is looking for a good place to start with real hardware, what would you recommend?

    • by dohzer ( 867770 )

      People always say that Xilinx has better devices, while Altera has a slightly better toolchain.
      Not sure how the Intel purchase will (has?) change(d) that.

    • I tried Vivado for a couple of days. The first day I started with a tutorial, following all the steps, and it worked. The next day I was making some modifications to the design in the GUI, and then I got some obscure error message out of the bowels of the synthesis, with no fucking clue how that error message was related to anything I had done. It took me hours of fruitless googling, going through generated code, and messing with the GUI, until I finally gave up and started from scratch again. This time I got no error

  • The open source way would be to get the toolchain source from Xilinx to fix all the mindbogglingly idiotic things in it, and publish the whole thing on github/gitlab.

  • by BAReFO0t ( 6240524 ) on Sunday October 20, 2019 @02:32PM (#59328452)

    MHz, BogoMIPS, Crysis FPS, SPECint/fp, I don't care.

    • Reading The Fine Article, it appears that a clock rate of 50 MHz has been achieved. That seems reasonable.
  • CPU cores for FPGAs have been available for years. I got into FPGAs in 2011 (as I co-developed the first open source Bitcoin mining hardware) and the "soft CPU" scene was already pretty established. For example, the Altera kit I got that summer included specific instructions and ready-made tools for making an embedded system with a CPU.

    Also, realtime raytracing isn't some magical feat per se, I've coded my own examples [instagram.com] in plain OpenGL. Of course, the "realtime" aspect depends on the scene and the detail

  • This guy knows a new COD is gonna drop, right? He can start doing something productive with his time.
  • Anyone who finds this interesting and hasn't already done so should read up on the MiSTer project. It uses the Altera DE10-Nano FPGA dev board, which allows it to run a very large number of retro computers, consoles, and arcade machines.

    The project created a framework that standardizes the way different FPGA cores are loaded and how IO works. This makes trying out a new system as easy as dragging and dropping a compiled core and a disk image onto an SD card or network share and you are up and running, versus dealing with re-imaging

  • It describes a shocking or perhaps not so shocking non-cromulent number of thoroughly encrapidated layers Xilinx added. Innovation through accretion.

    Ah, the joys of slow moving, legacy laden behemoths. If you understand his humor, you belong on /.

"I got everybody to pay up front...then I blew up their planet." "Now why didn't I think of that?" -- Post Bros. Comics

Working...