Ask Slashdot: Will We Ever Be Able To Make Our Own Computer Hardware At Home? 117

dryriver writes: The sheer extent of the data privacy catastrophe happening -- everything software/hardware potentially spies on us, and we don't get to see what is in the source code or circuit diagrams -- got me thinking about an intriguing possibility. Will it ever be possible to design and manufacture your own CPU, GPU, ASIC or RAM chip right in your own home? 3D printers already allow 3D objects to be printed at home that would previously have required an injection molding machine. Inkjet printers can do high DPI color printouts at home that would previously have required a printing press. Could this ever happen for making computer hardware? A compact home machine that can print out DIY electronic circuits right in your home or garage? Could this machine look a bit like a large inkjet printer, where you load the electronics equivalent of "premium glossy photo paper" into the printer, and out comes a printed, etched, or otherwise created integrated circuit that just needs some electricity to start working? If such a machine or "electronics printer" is technically feasible, would the powers that be ever allow us to own one?
This discussion has been archived. No new comments can be posted.

  • The actual things mentioned require insane clean rooms, massive R&D, and very large, expensive equipment, with zero push to allow manufacturing on a smaller scale, because there is no demand, and the required manufacturing tolerances just aren't feasible in a home environment. If you want it, just get an FPGA / CPLD and design your chips on that. Otherwise, we can already do circuit boards at home without any problem, so the rest of the machine is easy.

    • You can make your own computers from scratch already. It's just that they're 50 years out of date.

      Also, the powers that be don't control stuff in that fashion. The question is dumb. Ten bucks says whoever asked it hasn't looked into homebrew electronics at all.

      • Yes, you can do an FPGA-based processor. It will run at approximately 1/20 to 1/60 the speed, have lower memory bandwidth, be useless at almost everything, not be able to run your OS, and cost 20x more than off-the-shelf parts.

        And of course you can make your own parts at home: start with a 10-billion-dollar fab plant, huge teams of engineers, and 50 years to get the design working at current levels of technology.

        There are all sorts of options but I think this is way out in the weeds as a suggestion.
        • A lot of the newer ones come with CPUs built into the FPGA, so that what you're designing is hardware to use with that CPU (ie, specialized highly concurrent data processing). I've worked on imaging devices where most of the heavy lifting was done with big beefy FPGAs with some DSP final processing.

          You can build your own CPUs, but the purpose of that is just to be a hobbyist. It's certainly less messy than you see on the build-your-own-CPU-out-of-surplus-TTL sites.

      • And I remember doing so. If you'd like to refresh or learn those skills, I highly recommend "www.adafruit.org" for Christmas gifts; they have a nice variety of quite simple and some quite sophisticated projects for home hacking.

    • Unless things have changed radically within the past 2 or 3 years, making your own CPU a-la-FPGA isn't guaranteed to be a data-privacy or security improvement over using a commercially manufactured processor. The software that compiles Verilog or VHDL into the gate mappings for a particular FPGA is ALL completely opaque, proprietary, and more or less impossible to reverse-engineer by virtue of immense size and complexity (not to mention an army of lawyers ready to sue you into oblivion

      • Bit slice chips offer the opportunity to make pretty good machines, far better than an 8080 and (I think) as good as a 68000.
    • just get a FPGA / CPLD

      Nailed it. I've actually considered the data rates on Cyclone 10 LP and Cyclone V GX chips, and they're pretty high. Clock rate is tricky: it might run at 925MHz, but electricity only moves so fast, and you have to organize your chips properly inside to get the timing right (if you run a tightly-coupled activity from one corner to the other, it won't work). Still, a lot works, and you can implement V-PHY and GbE on the damned things--you could make a motherboard with an edge connector that allows you t
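A quick back-of-the-envelope check on the timing point above (a sketch: the 925MHz figure comes from the comment, but the ~0.5c effective on-die propagation speed is an assumption):

```python
# How far can a signal physically travel in one clock cycle?
C = 299_792_458.0             # speed of light in vacuum, m/s
f_clk = 925e6                 # clock rate quoted above, Hz
period = 1.0 / f_clk          # one clock period, ~1.08 ns
dist_vacuum = C * period      # distance light covers per cycle, meters
dist_chip = 0.5 * C * period  # assumed ~0.5c effective on-die speed

print(f"light per cycle:  {dist_vacuum * 100:.1f} cm")   # ~32.4 cm
print(f"signal per cycle: {dist_chip * 100:.1f} cm")     # ~16.2 cm
```

Subtract routing detours and per-gate delays, and a corner-to-corner path on a large die can eat most of that budget, which is exactly why tightly-coupled logic has to be placed close together.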

    • Depends on the transistor size. Yeah, for bleeding-edge stuff you're going to need a very expensive setup. For transistor sizes circa the 1990s and early 2000s, however, you can easily get the equipment to do that yourself; it will probably run you about $20k and a significant investment of time, though. The bigger issue is the software to design the cores: even for FPGAs there's no really good open source set of cores (essentially the geometrical tetris blocks you have to assemble based on their shape a
  • by Dputiger ( 561114 ) on Wednesday December 11, 2019 @07:58PM (#59510574)

    You said: " Will it ever be possible to design and manufacture your own CPU, GPU, ASIC or RAM chip right in your own home?"

    The answer to this question is that it's already *possible* to build these components in your own home. The problem is that the manufacturing techniques readily available to consumers for building and wiring hardware together do not lend themselves to the rigors of modern semiconductor manufacturing.

    But can you build *something?* Hell yes you can. Check this thing out:

    https://www.extremetech.com/ex... [extremetech.com]

    It's a 16-bit CPU with 256 bytes of memory, and every single piece of it is built from human-scale components.

    No advance in 3D printing is going to allow you to manufacture, say, a Core i7 in your house because you lack all of the industrial manufacturing and processing tools necessary for creating the wafer that such a chip requires. But there have absolutely been explorations of using 3D printing to create circuits that can cheaply and easily be applied to all manner of surfaces, including clothing. The final product of these efforts wouldn't be the sort of silicon you'd play a game on, so it might not meet your definition of being a CPU, RAM, ASIC, etc -- but these are definitely subjects of existing research in manufacturing.

    • The MOS 6502 (of Commodore C64 and Apple II fame) had fewer than 4,000 transistor functions and was manufactured in an 8 micrometer process (structure width approximately three ten-thousandths of an inch). Something like that may be feasible.
      • You mean the MOnSter 6502 [monster6502.com]?
        • by Octorian ( 14086 )

          Except even that project depends heavily on manufacturing technologies that are simply not available outside an industrial setting. The most you could do at home with that sort of project is final assembly, and even that is better done by a big fancy pick-and-place machine.

    • A lot of companies that have their own chips don't bother making the chips themselves. Design the chips and have a foundry create the ASICs out of that. It doesn't make it less real that the actual semiconductor process was done somewhere else.

      For a person to do this themselves at home, alone, generally means you're doing a smaller design. Big stuff takes so much work it's not in the realm of a hobbyist to manage effectively. Most modern large chip designs spend more effort on the testing than on the act


    • "If you’re wondering how a person ends up building a Megaprocessor in the first place, Newman notes that his project started as an attempt to understand the operation of a transistor, before noting “I didn’t plan on ending up here. I started by wanting to learn about transistors. Things got out of hand.”"

      Quote from the developer. I like him already.
    • This is awesome, and it should be noted that this was intentionally built to be large and display all the internal workings of the device on a human scale.

      It will be really interesting to see what can be done with 3D circuit printing meant to be miniaturized. It's not hard to imagine a tiny 3D-printed relay, and from there one could easily construct an adder, or whatever else. We just have to be realistic about size and power - barring a nano-scale 3D printer breakthrough, anything substantial will look mor

  • Comment removed based on user account deletion
    • Re: Apple (Score:5, Informative)

      by Ronin Developer ( 67677 ) on Wednesday December 11, 2019 @08:06PM (#59510612)

      They used discrete components, commercially available integrated circuits, and a CPU. They did not manufacture the CPU or ICs. They did lay out the printed circuit board.

      • The original Apple and the Apple II ran at such low clock speeds that you could hand-wire everything and it would work just fine. You can't do that with GHz clock speeds and LVDS.
        • For what it's worth, 8-10MHz is basically the point where you have to start being aware of things like bus capacitance and signal routing. 10-20MHz is the point where you have to really care about it. 20-40MHz is approximately the point where you have no choice, and MUST actively take it into account at all times or your hardware will fail in bizarre ways.

          That's the REAL reason why CPU speeds increased so slowly at first, then suddenly increased by leaps and bounds for a decade until we ran into the limits
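Those thresholds line up with a rough calculation using two common signal-integrity rules of thumb (assumed here, not stated in the comment): meaningful energy in a digital clock extends to about its 5th harmonic, and a trace starts to need transmission-line treatment once it exceeds roughly a tenth of that harmonic's wavelength:

```python
# Estimate the "critical" trace length for each clock rate mentioned above.
v = 0.5 * 3.0e8                      # rough signal speed on FR4 PCB, m/s

for f_mhz in (8, 20, 40):
    harmonic = 5 * f_mhz * 1e6       # 5th harmonic of the clock, Hz
    wavelength = v / harmonic        # wavelength on the board, meters
    critical = wavelength / 10       # rule-of-thumb critical trace length
    print(f"{f_mhz:>2} MHz clock: traces over ~{critical * 100:.1f} cm need care")
```

That works out to roughly 37.5, 15, and 7.5 cm: at 8 MHz the critical length is longer than most boards, while at 40 MHz it's shorter than plenty of real traces, matching the progression described above.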

          • Sounds about right. I remember the Morrow Designs S100/IEEE696 box I used for quite some time came with a 4MHz Z80A, and I was able to upgrade it to a 6MHz Z80B with no problems, but when I got my hands on an 8MHz Z80C, it was no bueno, wouldn't boot reliably at all. The bus just wasn't designed for even those speeds.
          • I designed and built a wirewrap prototype 80386EX controller that ran solid at 30MHz. I've mass-produced thousands and thousands of 80188-based controllers running at 33MHz, built on a 2-layer printed circuit board. Your estimate is a bit conservative, but certainly realistic.
      • by JBMcB ( 73720 )

        They used discrete components, commercially available integrated circuits, and a CPU. They did not manufacture the CPU or ICs. They did lay out the printed circuit board.

        Yeah, there's nothing stopping you from doing the same thing now. The article didn't say anything about building your own chips. However, people have been rolling their own cores in ASICs and FPGAs for years, so that's entirely possible. It won't be fast, though.

    • by malkavian ( 9512 )

      He used stock chips. If that was the question, I'd say an unreserved "Yes," as I did that nearly 30 years ago with a 68000 CPU, memory chips, EPROM, and various wires, resistors, capacitors, etc.
      However, the question is about building your own CPU/GPU etc., and currently there's no way around the environmental requirements for production at that level, and there's unlikely to be anything like it for a long time (eventually, it'll likely happen, as tech has a way of making what in one century is the height of

    • by msauve ( 701917 )
      "Pretty sure Woz built the first Apple at Steve's home"

      You lost the bet you didn't make. Steve glommed onto Woz's creation. Woz:engineering::Jobs:marketing.
  • I remember quite clearly there was a program for the Apple ][ where you could design your own circuit boards, test them, and then send the schematics to a company which would make them for you. But for the life of me I don't remember the name of it.

    Anyways, a friend actually was making her own boards and putting them together to make a running machine. Have no clue how much it was costing her,

    • I'm sure there was. Just like back in the day you used to be able to buy IEEE-696 (i.e. S100) bus cards that were pad-per-hole drilled, and you basically breadboarded your custom expansion card design onto it. We're talking about the days of PCBs being two-layer only and you could literally fab them at home with minimal equipment and get a working PCB out of it. Not necessarily plated-through holes, soldermask, or silkscreening, but it would be functional. These days? You have to use a prototyping service,
    • The modern equivalent is EasyEDA. Log in. Design the circuit. Lay it out. Give them $50. Boards come in the mail.
      It's pretty slick.

  • You could, if you have the equipment to do so, but I'm guessing you're not that rich, so... Duh.
  • by jamonterrell ( 517500 ) on Wednesday December 11, 2019 @08:11PM (#59510634)
    You're probably not going to make anything close to what you're using, but if you're looking to build something more modest, that's definitely possible. It really depends on what your goal is, and how advanced a starting point you're willing to accept. Ben Eater (https://eater.net/) has a guide on building an 8-bit computer from low-level gates. It's a really good basis for building some primitive devices that would be capable of something a little more advanced. I'm building a 16-bit version, with a display and keyboard. If one were interested, they could definitely build off something like this, adding encryption and a network interface that would give them a fairly trustworthy device. Another approach you could take is auditing the devices you use. There are guides on how to go from X-rays of chips to decoding them. Ken Shirriff is a ninja at this, and I HIGHLY recommend his talk: https://www.youtube.com/watch?... [youtube.com]
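To give a flavor of the gate-level approach guides like Ben Eater's take, here is a minimal sketch (Python standing in for hardware) of an 8-bit ripple-carry adder composed from a single NAND primitive, the same way a discrete-gate build does it:

```python
# Every gate below is derived from NAND, the classic universal gate.
def NAND(a, b): return 0 if (a and b) else 1

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))

def XOR(a, b):
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

def full_adder(a, b, cin):
    """One bit of addition: returns (sum bit, carry out)."""
    s1 = XOR(a, b)
    return XOR(s1, cin), OR(AND(a, b), AND(s1, cin))

def add8(x, y):
    """8-bit ripple-carry addition, least significant bit first."""
    carry, result = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # wraps modulo 256, like a real 8-bit ALU

print(add8(200, 100))  # 300 mod 256 = 44
```

Wiring the same thing from 74xx chips or discrete transistors is exactly this composition, just much slower to debug.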
  • Some interesting work going on with semiconductor alternatives:
    https://www.designnews.com/ele... [designnews.com]
    I could picture some kind of Star Trek jelly where intersections of different lasers at points in it can change it around from insulator to conductor to semiconductor, allowing circuit design totally through photo response. Fantasy right now though.

  • Short answer: No. (Score:5, Insightful)

    by Rick Schumann ( 4662797 ) on Wednesday December 11, 2019 @08:18PM (#59510664) Journal
    Long answer: If you're asking whether you can fab 7nm silicon in your garage for pennies on the dollar instead of shelling out hundreds and hundreds for a commercially manufactured part, then the answer for all intents and purposes is 'no'. You'd have to spend hundreds of thousands, if not a million, on all the equipment necessary, and that's assuming you'd be allowed to in a residentially zoned neighborhood, considering all the extremely dangerous and toxic chemicals needed to grow the wafers, do the lithography, and so on. Instead, you could use the top-tier FPGAs that are available and program them with whatever design you want, but again, if all you're after is cost savings, you won't get it that way either; FPGAs complex enough to hold a full CPU, GPU, SoC, etc. are still very expensive. Then there's having the technical chops to do the design work, as well as needing a multi-layer PCB design to support it. If all you're really after, though, is building a working computer from component parts instead of buying pre-made PCBs and bolting them together, you can still do that, I'm pretty sure. But if you're expecting an 8-core 64-bit machine that can run, say, the hottest new game software, you're not going to get that with what you can solder together at home.
    • Something I'd like to add to what I said above: I've been building computers of one sort or another since I was about 14 years old, starting with a COSMAC ELF [xi8.me] (with many expansions I designed and built myself, mind you, including integer BASIC and a 20mA current loop interface for a Model 33ASR TTY!) from a 1976 Popular Electronics article, a bunch of S100/IEEE696 stuff (including the iconic Imsai 8080, like from War Games, and some Morrow Designs stuff, plus some cards I built myself) all the way up throug
    • > Long answer: If you're asking whether you can fab 7nm silicon in your garage for pennies on the dollar instead of shelling out hundreds and hundreds for a commercially manufactured part, then the answer for all intents and purposes is 'no', you'd have to spend hundreds of thousands, if not a million, on all the equipment necessary

      Intel spent $13 BILLION last year and can't fab a 7nm CPU to compete with AMD. That's not to rag on Intel - a few years ago AMD couldn't compete with Intel on the high end.

      Th

      • Up until last year I was working at Intel, so yeah, I know the difficulties they're having. Apparently it's tough enough to make 10nm dies with a high enough yield, let alone ones that stay reliable for any length of time.
    • The original poster's main concern was about privacy, not cost. The problem with an FPGA is that it doesn't help much with that. A commercial FPGA could have a backdoor or malicious features built into it just as easily as any other chip.

  • Maybe special purpose (e.g., IoT devices), but not useful general purpose hardware, not any time soon.

    What IS possible - even right now - is to crowdsource (limited) custom fabrication based on open source hardware. This isn't easy (crowdsourced hardware has a bad track record), but it's certainly possible.

  • computers are on youtube..
  • Back in the early '80s when I was at university, one of our micro-electronics professors prophesied that one day you would be able to create a complete product by putting circuits on a single wafer using a machine that sat on your desk. He thought that there could be a complete fab that would take up a cubic foot or so that would provide this functionality. He also said that this would be limited to engineers because of the depth of knowledge required to specify the requirements of the product, map out th

  • If not, the short answer is emphatically NO for the foreseeable future.

    The long answer is that without a chip fab at your disposal, you -might- be able to design the chips, but you will never be able to afford to have them produced, and if you outsource the production, you have defeated the (security by obscurity) purpose of brewing your own stuff.

    Recommend you explore FPGAs, which get more affordable all the time, and the tool chain required to support the design process.

    Alternatively, the man is alway

    • Shit bro, I've got a cpu right here, ever hear of a 555? Just buy a reel, and stock in a capacitor company.

    • You *CAN* buy your own electron microscope for a reasonable price, then delid and shave the final product you get back from the fab to verify that it matches your own designs. i.e. you may not be able to build your own fab to make the chips, but you *can* verify that your own design was implemented correctly without any extra stuff. (Lots of extra shaving required to verify that no extra spy stuff was added into the package, but totally doable).

      Your design is no longer secret, but any obscurity you threw in

  • As others have mentioned, it depends on what you mean by "computer hardware", "make our own", and "ever".

    For sure you can build something at home, now, that's a Turing-complete computer, but it will be so slow and so small as to be essentially useless. A few kilohertz clock speed (if you're lucky), a few hundred bytes of memory. If you're lucky and capable and dedicated. And have LOTS of time to waste on the project.

    Even then, you'll have to purchase commercially made resistors, capacitors, transistors,
    • A few kilohertz clock speed (if you're lucky)

      There are people on youtube spilling over with happiness at building a 2 bit hand-clocked computer.

  • Now that the parts and tools are available to make an untraceable military weapon that could be used to literally and physically overthrow a government, and that has drawn nothing more than hand-wringing, why would that same government crack down on people creating electronics technology at home?

  • For my senior design project we designed a CPU using a software package that ran on SunOS by combining adders, loaders, registers, a clock, etc. We synthesized it onto a FPGA, and then we used YACC to make a C compiler for it, and compiled and ran some C programs on it like bubble sort, etc.

    You could do this then, and you could do it today easier and cheaper. The thing is, you likely want to run Windows and want x86 emulation. You're not going to do that on your own without years of design experience and
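The flow described above (compose registers, an adder, and control logic, then feed it compiled programs) can be shown in miniature as a fetch-decode-execute loop. The four-instruction accumulator ISA here is invented purely for illustration:

```python
# A toy accumulator machine with a hypothetical four-opcode ISA.
LOAD, ADD, STORE, HALT = range(4)

def run(program, memory):
    """Fetch, decode, and execute instructions until HALT."""
    acc, pc = 0, 0
    while True:
        op, arg = program[pc]  # fetch + decode
        pc += 1
        if op == LOAD:
            acc = memory[arg]
        elif op == ADD:
            acc = (acc + memory[arg]) & 0xFF  # 8-bit wraparound
        elif op == STORE:
            memory[arg] = acc
        elif op == HALT:
            return memory

# mem[2] receives the sum of mem[0] and mem[1]
print(run([(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)], [7, 5, 0]))  # [7, 5, 12]
```

A compiler (the YACC step in the comment) is then "just" a translator from C down to instruction tuples like these.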

  • The silly tiny geometry processes in a modern semiconductor are far out of reach for hobby purposes. It's expensive, there are dangerous chemicals involved, and a lot of math and process engineering goes into getting usable product. With that goes the power & performance. If you'll accept higher power and lower performance, though, it's always possible.

    To me, the thing most ripe for innovation and home development is PCBs. The dimensions of the traces and layers are reachable with various types of

    • To me, the thing most ripe for innovation and home development is PCBs. The dimensions of the traces and layers are reachable with various types of equipment readily at our disposal for cheap. 2-layer PCBs have been possible for decades with a laser printer and some chemicals, but inadequate for any modern design. It seems like there might be better ways of building up PCBs (for very low volumes) that might be both cleaner and just as effective. I'm just not sure anyone with the correct skills is act

  • We will probably have a Star Trek replicator one day, but you will still be relying on open source designs, and nobody will be able to fully know that there are not some nasties in there.
  • Eventually, when you are able to buy a machine that creates motherboards at home, the NSA will have already added a piece of control code to THAT machine, so that when it prints out a motherboard it includes the NSA spy chip.

    It's rabbit holes within rabbit holes all the way down.

  • Real chip fabs are complex, dangerous things. They use enormously hazardous chemicals and large amounts of very clean and stable power, and require the cleanest of clean room facilities. It's conceivable, however, that eventually we'll come up with some kind of variation on the scanning tunneling microscope that's affordable to hobbyists and capable of producing high-performance electronics. After all, the question is "will we ever be able to", not "will we soon be able to". It's wholly conceivable,

  • You can do it now... (Score:5, Informative)

    by LynnwoodRooster ( 966895 ) on Wednesday December 11, 2019 @09:31PM (#59510976) Journal
    Kids these days, never learned how to make a basic D or J/K flip flop from transistors? You can build up your own ALU and other CPU building blocks quite simply. Yes, you can build one today, at home - just lay it out with discrete components on a PCB, you can easily fab your own 2 layer PCB at home... Hell, wire-wrap a bunch of TO-92 parts if you want - it'll work!
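As an illustration of the parent's point, here is a gated D latch (the storage element underlying the flip-flops mentioned) built purely from four NAND gates, sketched in Python. The cross-coupled pair is iterated a few times to settle, the way the real circuit does; the topology is the textbook 4-NAND latch, but the code itself is only illustrative:

```python
def NAND(a, b): return 0 if (a and b) else 1

def d_latch(d, enable, q, qbar):
    """Classic four-NAND gated D latch; returns the settled (Q, /Q)."""
    for _ in range(4):                       # a few passes settle the feedback
        s = NAND(d, enable)                  # "set" input to the latch pair
        r = NAND(s, enable)                  # "reset" input
        q, qbar = NAND(s, qbar), NAND(r, q)  # cross-coupled output pair
    return q, qbar

# Drive it like you would on a breadboard:
q, qbar = 0, 1
q, qbar = d_latch(1, 1, q, qbar)  # enable high: output follows D
print(q, qbar)                    # 1 0
q, qbar = d_latch(0, 0, q, qbar)  # enable low: D is ignored, Q holds
print(q, qbar)                    # 1 0
```

An edge-triggered D flip-flop is two of these back to back (master/slave), and registers, counters, and an ALU follow from there.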
  • Ultimately it will be done with something like the replicators seen in Star Trek: devices that will 'print' anything by building it up from the atomic level. Stored energy would be converted to the requisite particles following a pattern of the user's choice.
  • by DeadBeef ( 15 ) on Wednesday December 11, 2019 @09:48PM (#59511022) Homepage

    Look up Ben Eater's 8-bit breadboard computer on YouTube for an example of what you can build from discrete components with a ton of patience. He does use some off-the-shelf parts, but he also demonstrates how you would go about building those parts yourself from basic components like transistors and resistors.

    Depending on how much you want to compromise by buying off the shelf, you can end up with something that can only just add a couple of numbers together and blink some LEDs to give you the answer, or with a mostly functional 8-bit computer from the '80s.

  • With SMD components, you can barely work on a board at home. Have you seen the pin pitches on modern ICs? There is a reason Adafruit sells quite a few "breakout boards". These simple boards stick an SMD IC along with some SMD caps and resistors and an LDO on a little board, with pins at reasonable pitches to access the IC pins.
    • With SMD components, you can barely work on a board at home.

      You really can. You can buy a perfectly serviceable hot air rework station for under $100 and a reflow oven for $200, if you don't feel like winging it with a toaster oven.

      Have you seen the pin pitches on modern IC's?

      Yes. 0.5mm pitch is not uncommon. I can place those and 0402 passives by hand without too much trouble.

      There is a reason adafruit sells quite a few "breakout boards". These simple boards stick an SMD IC device along with some SMD cap's

  • Not a computer, but an IC: http://sam.zeloof.xyz/first-ic... [zeloof.xyz]. This guy just bought everything, started learning, and created his first IC. It works.
  • is assemble a system that you can monitor completely with a rudimentary logic analyzer.

    Something like a 386, perhaps, which you can decap to inspect and then recap. With old-generation fab techniques, it might even be possible to inspect it with a hobby microscope.

  • We're not there (Score:4, Interesting)

    by sjames ( 1099 ) on Thursday December 12, 2019 @03:07AM (#59511568) Homepage Journal

    Fabbing silicon is a high hurdle. Lots of large and expensive high precision equipment and dangerous chemicals needed. Huge power requirements.

    OTOH, I have seen a great deal of progress with making PCBs. The most interesting approach I have seen is to spray a copper clad board with flat black spraypaint, burn it off where you want to etch using an engraving laser on an XY table (or modified 3d printer), etch with driveway cleaner from the hardware store and hydrogen peroxide from the drugstore, spray and laser it again to expose the solder pads, and reflow it with a cheap electric frying pan. A laser cutter can make you a solder stencil.

    While not exactly what you asked, it's a big step up over the old days of hand drawn etch resist and through hole soldering with an iron (or wire wrap). Then you can 3D print a case rather than making do with an Altoids tin or a generic project box.

    For home fabbing, never say never, but I'm not holding my breath.

    • How much power, how many dangerous chemicals, and how expensive the equipment, if you're only printing one chip instead of thousands a day?

      In 10 years from now?

      20?

  • by Xenna ( 37238 ) on Thursday December 12, 2019 @04:03AM (#59511628)

    https://hackaday.com/2010/03/1... [hackaday.com]

    But she's more into AR these days:

    https://www.theverge.com/2019/... [theverge.com]

  • We have arrived at a point where it is pretty much possible for anyone (with the skill and the intent to invest a bit of money) to make hardware on par with equipment from the 1960s and maybe 1970s. Whether you want to, that's the question.

    If your question is whether you can make something on par with an i9, 256 gigs of ram and a contemporary graphics card, the answer is no.

  • They're not going to rip any threads, but here are a few that people all over the world have been building:

    Homebrew CPU ring [homebrewcpuring.org]

  • by dskoll ( 99328 )

    At least, not unless you want a furnace, a clean room, all sorts of nasty chemicals, an X-ray lithography machine, etc, etc. There's a reason state-of-the-art chip fabs cost upwards of a billion dollars.

    Maybe you could scrape together a 1970s-era single metal layer NMOS fab with 10 micron feature sizes, but that's not going to be terribly useful.

    • Isn't the real reason they cost billions the fact that they are factories for churning out millions of chips per year? Isn't the need for a clean room based on needing to have dozens of people working with thousands of chips? Is the nastiness of the chemicals insurmountable, or simply a matter of packaging? Why not put the etching chemicals in a tiny cartridge (like ink/toner), and drain them into a waste bottle (like laser toner)?

      You need a massive factory to produce plastic widgets at an economically

  • As many others have said (and given some great detail), no you cannot fabricate 2019-era commercial CPUs at home, not happening, even assuming you had the expertise to design one (another huge hurdle). Yes you can absolutely make 70s-era commercial CPUs (or something with equivalent capability) if you are dedicated and patient. So by extrapolating, I'd guess that by the 2060s or 2070s you should be able to fabricate something of equivalent capability to a 2019 CPU at home. Maybe. But by then what use wi
  • On a long enough timeline (assuming humans still exist), the technology to 3D print at or near the atomic level will very likely exist. In such a scenario we could 3D print exact copies of the 3D printer itself, which should make them somewhat affordable. At that point you can print whatever you can design and have the raw materials for, potentially including living beings if you can print fast enough. However, the "we" in this case is humans far in the future. No chance anyone alive today will see it unless scie
  • I don't know how long it will be, but why wouldn't it be possible at some point in the not-too-distant future? Whether or not we'll still be using silicon by then I can't say, but I don't see any insurmountable roadblocks to the idea. You don't need a "clean room" to print one chip, you just need a tightly sealed "clean box". You don't need to worry about handling or disposing of dangerous chemicals any more than you need to worry about the toner carts and waste toner bin in a laser printer. Everything
    • Will we be able to in the future? Yeah, why wouldn't we?
      If we knew how to do it, we would already be doing it.
      Thinking it will become possible just because it's the future is silly.

  • ...don't put your hands in the sink." I had a colleague who designed, printed on acetate, and acid-etched his own PCBs when he was making a robot. It was 2001. And "Don't turn on the lights and don't put your hands in the sink" is what he said when his wife came home in the middle of the etching part of the process.
  • by zmooc ( 33175 ) <zmooc@zmooc.DEGASnet minus painter> on Thursday December 12, 2019 @08:39AM (#59512020) Homepage

    Yes. You already can.

    http://sam.zeloof.xyz/first-ic... [zeloof.xyz]

    Will it ever make so much economic sense that we start to see ready-to-use solutions on the market, the way we now have 3D printers? Probably not. Making very precise tiny stuff will always require inherently very expensive hardware, and making it in a way that makes economic sense compared to what the big manufacturers produce? Never.

    However, we will absolutely see 3D printers that can "print" relatively coarse circuits, though not computer hardware you would actually use. A notable thing I expect we will see one day is 3D-printed compliant structures with built-in (pressure/flex/touch) sensors and accompanying circuitry. Printing specialized robot limbs/tools with such integrated circuitry for use on your home robot will absolutely become a thing.

    However #2. You can make your own hardware at home today. Just order a machine from ASML. They do offer refurbished ones too!

  • There's no way you will ever have a computer in your home. They are room-sized machines with thousands of tubes that burn out frequently, and the power requirements alone ... also there would need to be a way to dissipate the enormous heat. And what would the average homeowner do with their own computer? Nope, it will never happen.
  • The answer to this question depends on where in the process of "build a computer" you start.

    IMHO - The dividing line between "Yes, it's possible" and "No way" is clearly at building the semiconductors. If you buy your semiconductors, then sure, it's *possible* to build something that passes as a computer in your spare room. Building a modern computer, though, with surface-mounted components and high clock rates, won't be possible without some serious level of outsourcing.

    Raw Materials - Not happen

  • To succeed, ensure no one has tracked your purchases of the required equipment and materials, or your online searches for how-tos and 3D printer files. Then ensure the equipment used to build the components doesn't call home announcing what was created. Then overcome all the technical challenges and hazards. Probably best to code an OS as well, to ensure it's spyware-free.
    After all that, the absence of any signals from the normal expected hardware "features" would likely cause the computer to be flagged as an anomaly as

  • by ceoyoyo ( 59147 )

    With a bit of persistence it's no problem. You can take the Babbage approach and build a mechanical one if you want. If you're going to go electronic, you can construct your own vacuum tubes. I expect someone with a bit of talent and lots of creativity could make okayish transistors too.

    If you trust transistors (or make your own), it's not hard to make a computer. We did it when I was a CS undergrad. They don't do much by modern standards, but it's a computer.

    There's really not that much hardware spying on

  • Data security is determined by the design of the operating system.
    The hardware is irrelevant.
