
Crossing the Divide From Software Dev To Hardware Dev 105

First time accepted submitter szczys writes "Quinn Dunki spent decades developing software before she fabricated her own 6502-based computer. Here she talks about crossing between software and hardware (or the other way around) and why this is easier today than it has been in the past."
  • tl;dr "Software is a bit like hardware but hardware is less virtual."

    Am I missing some point here? Maybe this person has achieved something I don't know about that makes their message more relevant than I can see.

    • by Anonymous Coward

      It's tough to be negative about someone who appreciates the 6502, but there is a huge difference between putzing around with discrete components and DIPs vs true hardware engineering these days.

      Stuff that was hard a while ago is definitely easier now, but technology has moved on... professional hardware designers are doing much more complex stuff.

      • by Joining Yet Again ( 2992179 ) on Friday October 18, 2013 @08:42PM (#45171265)

        Yeah, nowt wrong with playing with a 6502 - I grew up on them - but building a 6502-powered toy requires only a few tens of dollars, moderate skill with a soldering iron, and the ability to vaguely follow any of several dozen short articles on putting together a simple 8-bit machine.

        Of course it's cool to build your own hobby-quality 6502-based machine. And there are probably lots of EEs who would fail at such a project without assistance - not because it's wildly complex but because it's sufficiently esoteric. But that doesn't mean you're qualified to pontificate on the transition from pro s/w to pro h/w development just because you've built such a machine. The tools, goals, risks, breadth, imagination, etc. put into a hobby product are quite different from those put into a pro product.

        To take another example, with my ham radio hat on I get involved in lots of cool projects which would never make for a saleable product, but which I'd never get the chance to do professionally anyway, because relevant demand is satisfied for the average consumer in a very different way (e.g. almost everyone is happy with relying on a huge amount of private and political infrastructure to communicate globally).

        • The biggest barrier to doing cool stuff NOW with homebrew hardware is DRM. Or, more precisely, the fact that any SoC infected by HDMI or the ability to do hardware-accelerated H.264 is going to be encumbered by viral licensing terms that make it nearly impossible for anyone smaller than Logitech to get their hands on real source code and unredacted datasheets for the best chips out there... or even the ones that are 17 steps down from the top, and so ghetto, not even $7 media players from China embedded in H…

          • Is the lack of documentation hampering your development? You wouldn't have much of this high-end technology if companies had not sunk development money into it and used licensing to recoup their costs.

            • by Anonymous Coward

              Oh quit whining! If you expect me to give a shit about those poor little companies who are only trying to recoup their costs by bilking their users and hiding their specs: fuck you, asshole.

            • Progress happens despite the excesses of capitalism, not because of them.

              Companies would also be more profitable with slaves, but they don't stop producing when slavery is outlawed. It's up to the society that protects these profiteers' property to regulate them as it judges best.

          • > 30-40MHz is the point where stuff that's not properly designed just plain doesn't work)

            So design it properly. FFS.

            • OK, more precisely... 30-40MHz also happens to be the point where the demands imposed by a "proper" design ALSO outstrip what you, as an individual, can build with a home-etched 2-layer circuit board, hand solder, and/or troubleshoot when it doesn't work reliably. It's the point where things like 4-layer circuit boards with ground planes and solder masks become non-negotiable requirements. And it's the line in the sand where getting multiple devices to coexist and share that same high-speed bus becomes a RE…

              • Yes. A board with a ground plane is handy for higher frequency signalling. If you have sufficient cash, you can use one of the hobbyist PCB suppliers to build the PCB for you on a 4 or 6 layer process.

                An alternative for the home hobbyist is to use impedance controlled wires, like twisted pairs or mini coax, terminated correctly. This is more fiddly, but still doable.

      • by sjames ( 1099 )

        Stuff that was hard a while ago is definitely easier now

        Sometimes. A 6502 system comes up taking memory for granted. On a modern x86 or ARM, there isn't any usable system memory when the CPU comes out of reset; it has to initialize the memory controller and the DRAMs themselves before it can use system memory. It's much easier to write the firmware for a 6502-based system.

        • I always assumed a microcontroller on the mainboard does all the setting up before starting the processor.
          • by sjames ( 1099 )

            Alas, no. In the x86 world, the initial code runs without any memory just long enough to set up 'cache-as-RAM' mode, where the cache itself acts as RAM even though there's nowhere to write back/through to. That can be a bit fragile, since cache line eviction can still occur and there's nowhere to evict to. In the late '90s the controllers were a bit simpler and the memory could be set up using only ROM and registers.

            To top it off, it's all mostly undocumented, and a few chipset manuals are works of pure fic…

    • Reading between the lines, it may be that the point of the (IT/Tech) article was that a woman was the principal, rather than a man.

      Yay for that, but otherwise it seems to be celebrating mediocrity. . .
    • by mikael ( 484 )

      When you develop hardware, you use software simulators, FPGA boards with bit files (a binary representation of the hardware circuits that can be stored on flash memory) and actual silicon. The software simulator is meant to simulate the memory mapping and parallel nature of the hardware, the FPGA boards do this a bit quicker, and actual silicon is the real deal. Only the first two can be updated if things don't work as planned unless the hardware has programmable microcode.

      • Can't wait till chip manufacturers have their tools so flexible that they can offer to make small series of custom designs for an affordable price and a quick turnaround time, like what pcb prototyping services are now.
  • Geez, when I started programming all you had was stdclib. Now I have teams of people who know orders of magnitude more than I do doing my bidding. It's great!

    Seriously, though, everything is easier these days. There's a ridiculous amount of stuff out there, if you have enough time to plow through it all.

    • by Zero__Kelvin ( 151819 ) on Friday October 18, 2013 @08:06PM (#45171085) Homepage
      No. Everything is not easier today. In the mid 1990s I designed a small microcontroller-based PC/104-compatible motherboard, wrote a bootloader and small OS for it, as well as the PC application to download the firmware images to the system. Nobody could ever do that with today's processors. I wire-wrapped the entire prototype, which was possible because the clock frequency was low enough. These days you need to understand RF analog design to do PCB layout, because at 1+ GHz clock speeds every line radiates and receives like an antenna. You have to really know your stuff, and it is literally impossible to wire-wrap it. Even if you really do know your stuff, you need cash, and lots of it, to go from CAD/CAE files to working prototype.

      If the point is that you can still do old school low frequency projects from the ground up ... well obviously. But if the claim is that it is easy to do the hardware and software for a complete system on a par with contemporary designs? No friggin' way. Period.

      It is, however, easier to fool yourself into thinking it is easy, but if someone who actually knows what they are doing looks at your code they will have trouble deciding if they should laugh or cry.
      • by queazocotal ( 915608 ) on Friday October 18, 2013 @08:43PM (#45171275)

        To expand - and put this more in the context of a mobile phone class device say.

        Why open-source software works is:
        Widely available repository of code.
        Many people able to review it, or sections of it, and understand it.
        Ease of submitting tested patches.

        Hardware has problems that don't really fit well with this.
        The open schematic is the trivially easy part, and not really a problem.
        (though in practice, you need a schematic with copious links to design documents, which isn't well solved by open tools).

        The number of people who can review it is rather smaller, as you can't open up a C file and see a clear error or awkwardness in code that can be edited.

        For all but the most basic errors, you are going to have to sit down and
        read several hundred pages of hardware documentation about how the chips in question work, in addition to having in-depth knowledge about the circuit design, and costings of likely changes.

        Now, you've done this, and generated a patch that you think (for example) lowers the supply current by 1%.

        Compile - test.
        On a PC, this takes a couple of minutes.

        For something of a smartphone class, a one-off PCB may cost several hundred dollars. Then the parts will cost another several hundred dollars in small quantities, as well as being difficult to obtain.
        Now, you have to solder the parts onto the board, which is a decidedly nontrivial thing - and if you decide you want someone else to do this, it's probably another several hundred dollars.

        So, you're at the thick end of a thousand dollars for a 'compile'.

        Now, you boot the device, and it exhibits random hangs.

        Neglecting the fact that you are going to need several hundred to several thousand dollars of test equipment, you now have to find
        the bug.

        Is it:
        A) The fact that unlabeled 0.5×1 mm component C38 is in fact 20% over the designed value, as the assembly company put the wrong one in.
        B) C38 has a tiny bridge of solder underneath it that is making intermittent contact.
        C) The chipmaker for the main chip hasn't noticed that their chip doesn't quite do what they say it will do, and the datasheet is wrong.
        D) You missed a tangential reference on page 384 of the datasheet to proper setup of the RAM chip, and it is pure coincidence that all models up till now have booted.
        E) Because you're ordering small quantities, you had to resort to getting the chips from a distributor who didn't watch their supply chain really carefully, and your main chip has in fact been desoldered from a broken cellphone.
        F) Though the design of the circuit is correct, and the board you made matches that design, and all the parts are correct and work properly, the inherent undesired elements introduced by real life physics means it doesn't work.
        G) A completely random failure of a part that could occur with even the best design, and best manufacture.

        G - may mean that it's worthwhile making two or more of each revision - which of course boosts costs.

        Hardware is nasty.

        This gets a lot less painful of course for lower end hardware. For very limited circuits, which can be done on simple inexpensive PCBs, and be easily soldered at home - costs of a 'compile' can be in the tens of dollars, or even lower.

      • If the point is that you can still do old school low frequency projects from the ground up ...

        ... then everything is easier today!

      • But if the claim is that it is easy to do the hardware and software for a complete system on a par with contemporary designs? No friggin' way. Period.

        No one made that claim, Don Quixote. Settle down. It's a lot easier to do what you did in the '80s and '90s now, though, simply because there's a lot more easy-to-find information.

      • I think you missed the point. They're not saying doing things at the bleeding edge today is easier than the bleeding edge yesterday, they're saying doing the things bleeding edge yesterday are easier today.

      • by xtal ( 49134 )


        The transmission line tools are all there. I've done board up ARM designs in a basement, from the IC through to bootloading.

        It wasn't wire wrap cheap, but it wasn't undoable, either. There are solid reference designs you work from. My budget was about $35,000 in 2005.

        What's different today is documentation is very good, everywhere, there are lots of reference designs, and places to get help. If the platform you're working on doesn't have those, you go somewhere else.

        The Pi has a full set of open designs. Tell me where you were going to find those in the 90's?

        • "My budget was about $35,000 in 2005."

          I did mine for less than $300.00. See the difference?

          "The Pi has a full set of open designs. Tell me where you were going to find those in the 90's?"

          I told you ... I designed one myself from the ground up. Now if your point is that it is much easier to design hardware if you just buy it pre-designed and pre-made, then I totally agree with you. Buying it pre-designed and manufactured is indeed easier than designing it yourself.

      • Nobody makes you use clocks in the gigahertz range. Atmel and numerous other brands still make 8-bit microcontrollers that can go as low as 32 kHz or as high as 20 MHz. It's up to you, and all the HF stuff happens in the chip anyway. I recently designed a vending machine controller with an all through-hole board and a clock of 3.386 MHz. No problem.
      • Wirewrapping and blinkenlichten. Sounds like fun. :)
    • There's a ridiculous amount of stuff out there, if you have enough time to plow through it all.

      That is what makes it hard these days. And once in a while you still have to write programs using the standard C libraries (stdclib is C++, you lucky bastard ;-p)

  • Run! (Score:4, Insightful)

    by sycodon ( 149926 ) on Friday October 18, 2013 @08:03PM (#45171065)

    Run the opposite direction!

    Software is made up of worlds created by people hopped up on caffeine and suffering from too little sleep. Hardware follows physical laws, software follows no laws.

    Hardware is created, finalized, and shipped. Software is a never-ending dreary slog of bug fixes, upgrades, and incompatibility.

    For your own sanity, stay away.

    • ... software follows no laws.

      Tell that to my compiler. I'd say if anything software is very structured; you have a limited pool of recognized syntax that can be combined in specific ways. If you're using a library, you have to adhere to its API. Ultimately your code will be running on some processor, which has a limited set of instructions it can perform. Software has no laws? Hardly.

      For your own sanity, stay away.

      Sounds like it might already be too late for you...

    • by AmiMoJo ( 196126 ) *

      Hardware is created, finalized and shipped.

      If only that were true. Almost all older devices have some kind of bodge in them. A wire soldered on here, an extra resistor there. Sometimes you can't even see where they realized a component value was wrong and re-fitted all their existing stock. The only reason it's less common now is that more things have firmware and can be bodged via a software upgrade.

  • Designing something that runs close to the DC end of the spectrum, like a 6502, requires far less engineering know-how than pushing into the high-speed domain beyond 100 MHz. That doesn't invalidate the value of designing with "old" tech when it will suffice, but it isn't a viable way to change career paths.

    • I don't necessarily agree. If you're designing the chips, then designing high-speed devices today (what with all of the simulation and layout tools available) isn't really any more difficult than it was for the guys that designed the 6502. Likewise, if you're designing a product that uses those chips, you have enough tools at your disposal to make sure the design somewhat works at the first iteration, compared to back then when you might spend weeks wire wrapping and reworking a prototype board before yo…
  • by Anonymous Coward

    If you don't know both, you are only using half your brain

  • I was trained in college as a HW guy, did a few boards and ASICs, and now I'm a SW guy. Hardware is increasingly done in ASICs, programmable gate arrays, and PLDs, which are now written in languages like Verilog and VHDL. It's much like software, but massively parallel. With machines coming with more and more cores now, the differences are shrinking rapidly. The real difference is that it costs hundreds of $k to compile the final package in silicon. ;-)
    • by 0123456 ( 636235 )

      The real difference is that it costs hundreds of $k to compile the final package in silicon. ;-)

      Only hundreds of thousands? From what I remember, our masks alone used to cost $1,000,000+, and a 'cheap' metal layer fix for a chip bug was about $50,000.

  • by Murdoch5 ( 1563847 ) on Friday October 18, 2013 @11:18PM (#45172129)
    Without having a good understanding of hardware development you cannot be a good desktop- or embedded-level developer.
    • by artor3 ( 1344997 )

      I strongly disagree with that notion. It's important for developers to understand how the hardware is architected in some cases, but that's not the same as having an understanding of hardware design. The developers don't need to know about coupling or latch-up or impedance matching. If they do, that's a sign that the hardware engineers did something very, very wrong.

      • I would venture that if you ask any modern computer science student about things like CPU/MPU state, I/O load, pipeline state, memory allocation at the byte level, CPU/MPU interrupts, and more, they couldn't answer you. All of these and more need to be actively in your head while you code. It doesn't matter if you are writing a music player like iTunes or an embedded firmware, you need to approach the problem from the bare metal up.

        I have a friend taking a Software Engineering degree right now and when I look at his code I shudder.
        • I have a friend taking a Software Engineering degree right now and when I look at his code I shudder. While his code is pretty and utilizes managed run-times and style, it's generally a resource disaster, and he's admitted to me they don't even have a course on resource management. He can't tell me how many bytes of memory his program will use, how to optimize the pipeline for better run-time, how to save I/O loading through DMA requests, etc...

          But how would he even know that stuff? The issues you are describing are very platform-specific and deep down to hardware. They should be mostly the problem of the compiler and operating system.

          I agree though that you have to know what happens under the hood to be a good programmer. That could mean, for example: knowing a bit about executable formats, how a CPU works (registers, program counter, flags...), what kind of init code the compiler inserts, how stack and heap are organized, and how to use a de…

          • Software Engineering courses need to be updated to include hardware based design and understanding courses. Like you clearly pointed out, you need to understand what goes on under the hood to be a good programmer.
        • All of these and more need to be actively in your head while you code.

          These are all distractions for the things that matter.

          He can't tell me how many bytes of memory his program will use

          Cross compile to 64 bits and the answer will be different. Use a different C++ compiler and the answer might be different. Compile for a different OS and the answer might be different (some structs have different sizes on different platforms).

          how to optimize the pipeline for better run-time

          This should be done by the compiler. Besides, are you assuming an x86 CPU? You *do* know software these days are expected to be portable across CPUs, right?

          how to save I/O loading through DMA requests

          WTF? Are you still programming in DOS? How do you enforce D…

          • If you're going to port your code to multiple architectures then you need to understand the architectures and all the ins and outs that make them run. If you're going to program in C++, C#, C, ASM, Matlab, COBOL, etc., you need to understand how they use memory and the sizes of different aspects of the language. You should never just fall back and let the OS run your code without thinking about how it's going to run your code and how your code will interact with your hardware. You can always tell when a prog…
            • You should never just fall back and let the OS run your code without thinking how it's going to run your code and how your code will interact with your hardware

              Look, when I write Javascript, the most I would do is to look at how the 4 major browsers execute the code. I don't think about what the OS would do. Heck, given the ever evolving state of Javascript engines these days, I don't even know what the browser actually does. Does that make 99.99% web developers out there bad programmers?

              When I write Java, it could end up running on Windows, Linux, MacOS, or Android, even on other platforms. The heavily optimized JVM handles the hardware/OS level optimizations for…

              • Well, first of all, I said desktop programming and not web programming; JavaScript would be classified as a web language and not a desktop language. Secondly, Java is a horrible concept; its entire basis is to create a virtual machine you run on instead of sitting right on the OS. While I can agree it's a widespread and commonly used language, I still hate it. Lastly, you can have whatever belief you want, but I don't agree; if someone is going to take my C code and move it to a Blackfin arch then so be it, b…
  • by chthon ( 580889 ) on Saturday October 19, 2013 @04:27AM (#45173265) Homepage Journal

    I am a fan of the people who build their own computers from MSI components [].

    I discovered microprocessors around 1980, when I was 14 years old, but here in little Belgium I was never able to do anything with that knowledge at the time. Still, my interest got me a bachelor's degree in electronics, and a good (better) understanding of how software works. I was always interested in FPGAs, but it was only around 2010-2012 that I finally got the chance to do more than programming. I got my master's degree in electronics, and along the way I learned VHDL (one of the reasons I wanted to go for my master's degree), got an interesting school assignment about on-the-fly reconfigurable hardware, and did a thesis involving the Spartan-6 Atlys board.

    Also, since 2004 I have been working on and off studying Common Lisp, and processor emulators.

    Well, since September 2012 I have been designing a simple microprocessor, for which I first implemented an assembler in Common Lisp, and a simulator, and at the start of this September I finally got around to implementing the simple computer system in VHDL. I was surprised how easy it was, given that I only have about 1 to 2 hours a day in the evening to work on things. It is currently a 16-bit design which uses 64 kB of FPGA block RAM.

    Thus, with software knowledge and VHDL, it should become even easier to build custom microprocessors.

    And I am not even crossing this line. It has always interested me to go for both hard- and software, but due to circumstances I ended up more on the side of software.

    Having the room for doing electronics properly is not that easy. One needs a place committed to it, which cannot be used by other people in the family. For that reason, I like the concept of FPGA development boards. They let me do what I want to, without needing to invest in dedicated space.

    The Atlys board gives me all I need for growing in the future. The first part should be to make the system run using the on-board serial controller, so that I can control it through a terminal program, having access to a keyboard and a character terminal.

    And I am not done with software, because one of my goals is to write a Lisp system for running on the system, and then start to optimize the ISA for better performance. Other things: go to a 32-bit implementation and start using the on-board 128 MB RAM memory.

  • I love hardware. Built a 6800 machine on solderless breadboards quite a few years back. Kind of a dumb thing to do durability-wise, but it was quick and easy. That brings me to my point. Many new devices these days are only available as surface-mount devices. Have you ever tried to solder one of those bad boys into a circuit? The Arduino and its ilk are great, but shields and stuff are basically pre-built, and component-level design is becoming a lost art. Well, I suppose you could say that the advent of ICs…

    • Re. Schmartboard adapters, here's a 0.5 mm pitch QFP to 0.1" adapter []. It's 10 bucks. OK, a BGA one is $45. But still, what are you looking at that you don't find inexpensive enough? What family of devices are you wanting to work with?

      Aside from hacking some existing proto- or dev-board with the device you want to work with (e.g. with short patch cables or headers to a breadboard or other daughter proto-boards), you should consider just biting the bullet and learning to design and solder your own SMT PCB

  • I fail to see the difficulty, or the divide, since hardware is a question of petrifying some software to enhance the operation of certain algorithms.

    I remember reading articles several years ago by Chuck Moore about what he was doing to control a silicon foundry to produce chips which would hard wire some algorithms in silicon while leaving the rest as software implementations in Forth.

    TILs (Threaded Interpreted Languages) lend themselves very well to this.

    The level of interpretation, and the repetition of…

    • Not quite - hardware is only good at petrifying O(1) algorithms. They have to be tightly bounded in space and time.

  • by hamster_nz ( 656572 ) on Saturday October 19, 2013 @03:16PM (#45176261)

    I've taken the high-tech way to designing hardware: FPGAs. So far I've built quite a few bits and bobs - 200 MHz GPS-referenced frequency counters, a 60-stage Mandelbrot pipeline (12B ops per second @ 200 MHz), SDRAM controllers, my own processors, video adapters, and an implementation of the DVI-D protocol. I've even worked out how to make a chip with 1 Gb/s outputs work at 1.5 Gb/s - yay! 1080p! Everything is in a HDL (Hardware Description Language) so it can be used by others on their own projects. It isn't that expensive - dev boards start at less than $100.

    As Quinn points out it takes a very long time to get everything working correctly - you software guys don't know how lucky you have it!

    I've put some of my projects on my wiki at [] if anybody wants to take a look.
