Developer Finds USB Chargers Have as Much Processing Power as the Apollo 11 Guidance Computers (gizmodo.com) 110

An anonymous reader shares a report: It comes as no surprise that the guidance computers aboard the Apollo 11 spacecraft were impossibly primitive compared to the pocket computers we all carry around 50 years later. But on his website, an Apple developer analyzed the tech specs even further and found that even something as simple as a modern USB charger packs more processing power. Forrest Heller, a software developer who formerly worked on Occipital's Structure 3D scanner accessory for mobile devices and now works for Apple, broke down the processing power, memory, and storage capacity of Google's 18W Pixel charger, Huawei's 40W SuperCharge, the Anker PowerPort Atom PD 2 charger, and the Apollo 11 guidance computer, also referred to as the AGC. It's not easy to directly compare those modern devices with the 50-year-old AGC, which was custom developed by NASA for controlling and automating the guidance and navigation systems aboard the Apollo 11 spacecraft.

In a time when computers were the size of entire rooms, the AGC was contained in a box just a few feet in length because it was one of the first computers to be made with integrated circuits. Instead of plopping in an off-the-shelf processor, NASA's engineers designed and built the AGC with somewhere around 5,600 electronic gates that were capable of performing nearly 40,000 simple mathematical calculations every second. While we measure processor speeds in gigahertz these days, the AGC chugged along at 1.024 MHz. By comparison, the Anker PowerPort Atom PD 2 USB-C charger includes a Cypress CYPD4225 processor running at 48 MHz with twice the RAM of the AGC, and almost twice the storage space for software instructions.
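For a sense of scale, here is a rough back-of-envelope comparison using only the figures quoted above. Instructions-per-cycle differ wildly across architectures, so treat the ratios as order-of-magnitude estimates, nothing more:

```c
/* Back-of-envelope comparison of the figures quoted in the summary.
 * AGC numbers come from the article; the CYPD4225 clock is the figure
 * cited for the Anker charger. Cycles-per-instruction are NOT equal
 * across architectures, so these ratios are only rough estimates. */
#include <stdio.h>

int main(void) {
    const double agc_clock_hz    = 1.024e6;  /* AGC clock, 1.024 MHz      */
    const double agc_ops_per_sec = 40000.0;  /* ~40k simple additions/sec */
    const double cypd_clock_hz   = 48.0e6;   /* Cypress CYPD4225, 48 MHz  */

    printf("Clock ratio (charger/AGC): %.1fx\n",
           cypd_clock_hz / agc_clock_hz);    /* ~46.9x */
    printf("At even 1 op/cycle, the charger manages %.0fx the AGC's ops/sec\n",
           cypd_clock_hz / agc_ops_per_sec); /* 1200x  */
    return 0;
}
```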

Comments Filter:
  • I find it truly shocking, yes, I am shocked! that technology can advance over time! Who would've thought that in only 50 years we could shrink a brand-new piece of technology from feet to inches? Shocking! Next they'll be telling us how cars are faster - and safer - medicine saves more lives, and farmers grow more food. This tech stuff really blows my mind! I should go mine a Bitcoin to celebrate.
    • by Anonymous Coward

      Yeah! And as we all know, all technologies progress at the same rate!

      Why, the Concorde is just as slow compared to a modern jet! Uh, wait. Or maybe the SR-71? Or the XB-70?

      Almost as if processing information is fundamentally different from physical pursuits.

      https://dothemath.ucsd.edu/201... [ucsd.edu]

    • Your Tesla is also slower than the old 1967 Dodge Coronet grudge match car I built, which - by virtue of regularly running high 9s in the 1/4 mile - had to have all kinds of safety gear like a roll cage, parachute, wheelie bars, scatter shield, battery cutoff, etc. And I built it for under $10,000 in my own garage. Nowadays, it takes $100K and a bunch of advanced electronics to get within a few seconds of that 1/4 mile time!
    • by ranton ( 36917 ) on Friday February 14, 2020 @10:33AM (#59728110)

      What shocks (some) people is not that technology has progressed. They marvel at how much was accomplished with technology we now think of as very basic. It is similar to how we feel about wonders of the ancient world like the pyramids. We could certainly build them today, and we would likely do it easier and quicker with concrete. But the fact the ancient world was able to accomplish it with primitive technology still fills people with awe and wonder.

      • Re: (Score:3, Informative)

        It is similar to how we feel about wonders of the ancient world like the pyramids. We could certainly build them today, and we would likely do it easier and quicker with concrete.

        True, but then they wouldn't last as long either. Just about all modern structures use reinforced concrete which contains rebar. Unfortunately rebar is made of steel. Even though it's in concrete, it will still corrode much faster than most people think. The typical expected lifespan of a reinforced concrete structure is 50 to 100 years at best. The pyramids are 4500 years old.

      • The thing I find shocking is how even very simple operations are now dependent on very complex hardware. As an example, half of the things people use a Raspberry Pi for could be accomplished with a few basic electronic components. While I realize that in many of these cases the Pi is actually the cheapest option, it seems odd to bring so much added complexity into a fairly simple task, even if it is mostly hidden from the user.
    • Today's Honda Civics have a computer just running the engine that has more power than the Apollo AGC had. And that doesn't account for the computer running the 6-speed automatic transmission in the Civic, either.
    • by klubar ( 591384 )

      Alas, the physics fundamentals of putting something in orbit haven't changed in the last 50 (or last 13 billion) years. It still takes a lot of energy to move something from earth to orbit.

      And beyond this, there really haven't been major breakthroughs in materials, propellants, or design. (A few, but nothing that someone 50 years ago wouldn't recognize.) Perhaps the simulations and calculations are faster and easier now, but launching something to space is still about the same as it was.

  • A lot of the processing was performed on IBM mainframes and transmitted during the mission.
    • by K. S. Kyosuke ( 729550 ) on Friday February 14, 2020 @09:17AM (#59727842)
      The AGC was designed with a fully autonomous mission in mind, though.
      • Precisely. The mission could hypothetically have been executed without any ground support aside from recovery. They did it that way because they were concerned that the Soviets would attempt to interfere with it, and the easiest way would be jamming the uplink.

                As it turned out, it probably would have been successful had they needed to do it, but of course they didn't.

    • Probably done on the S/360, which handled air traffic control until 1989!

    • by AmiMoJo ( 196126 ) on Friday February 14, 2020 @10:27AM (#59728090) Homepage Journal

      Also the reason the chargers are so complex is not because they have to be, it's because computing power is so insanely cheap. The designers will have thrown an ARM microcontroller into the charge management IC, an off-the-shelf design, and thrown an existing USB stack on it. It could be done more efficiently but why bother? In fact for volume manufacturing it's probably cheaper to use a known, tested design even if it needs a bit more silicon.

      Apollo was state of the art because they needed that. USB chargers are complex because they are optimized for mass production and rapid iteration.
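      To make that concrete, here is a hypothetical sketch of what such a charger's firmware main loop might look like. Every name below is invented for illustration; this is not the Cypress SDK or any real vendor's stack:

```c
/* Hypothetical sketch of a USB-PD charger firmware main loop, along the
 * lines the parent comment describes: a generic MCU core running an
 * off-the-shelf Power Delivery stack. All names are made up; the
 * "vendor stack" is stubbed out so the sketch compiles standalone. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

static int ticks = 0;
static void pd_stack_init(void) { puts("PD stack up"); }
static void pd_stack_poll(void) { ticks++; } /* one pass of the protocol state machine */
static bool pd_contract_changed(uint32_t *mv, uint32_t *ma) {
    /* pretend the sink negotiated a 9 V / 2 A contract on the third poll */
    if (ticks == 3) { *mv = 9000; *ma = 2000; return true; }
    return false;
}
static void regulator_set_output(uint32_t mv, uint32_t ma) {
    printf("retarget DC/DC stage: %u mV @ %u mA\n", (unsigned)mv, (unsigned)ma);
}

int main(void) {
    pd_stack_init();
    for (int i = 0; i < 10; i++) {        /* on real hardware: for (;;) */
        pd_stack_poll();                  /* stack handles CC-line messaging */
        uint32_t mv, ma;
        if (pd_contract_changed(&mv, &ma))
            regulator_set_output(mv, ma); /* reprogram the power stage */
    }
    return 0;
}
```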

      • Quite. I mean, you can buy MCUs with less RAM than the AGC, but you'd have a hard job finding one with less computing power. Even the 64-byte-RAM ATtinys can bobble along at 16MHz, one cycle per instruction.

        Maybe an Epson 4 bitter?

        • by AmiMoJo ( 196126 )

          It's going to be hard to find something as slow as the AGC... I think there were some early MCUs that used a 1 bit ALU, maybe those.

          • It's going to be hard to find something as slow as the AGC... I think there were some early MCUs that used a 1 bit ALU, maybe those.

            Yep. A modern 8051 will easily hit 32MHz and has none of that 12-cycle clock divider stuff. Not single-cycle instructions, but an 8-bit divide is only 5 cycles.

            OK found one. The EM6503, nominal 32768Hz, 2 cycles per instruction, 4 bits wide. Obviously available in mask ROM format.
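            Taking those figures at face value, the raw throughput comparison works out roughly like this (a quick sanity check, nothing more):

```c
/* Quick arithmetic check of the figures in this subthread. */
#include <stdio.h>

int main(void) {
    const double em6503_ips = 32768.0 / 2.0; /* 2 cycles/instr -> 16384 instr/sec  */
    const double agc_ops    = 40000.0;       /* ~40k simple adds/sec (per summary) */
    printf("EM6503: %.0f instr/s vs AGC: %.0f ops/s (about %.1fx slower)\n",
           em6503_ips, agc_ops, agc_ops / em6503_ips); /* ~2.4x */
    return 0;
}
```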

          • I worked for a water quality instrumentation company as their de facto lead engineer (no degree, just more electronics background than anyone else in the company), and I designed something for them using a PIC chip in an 8-pin DIP package that used an external 32.768KHz crystal as its clock source instead of its internal 4MHz calibrated RC clock (it had to do with battery current draw and instruction cycle time).
            I'm sitting here reading this, thinking about that, and thinking "there's PIC chips I looked at even
            • by AmiMoJo ( 196126 )

              I think some PICs can be limited to sub 1MHz speeds at low voltage.

            • A lot of the PICs are fully static and can be clocked right down to nothing with an astonishingly low power draw and they'll all run off crystals. I'm guessing the 32768Hz came about because the only low frequency crystals were the ones made for cheap, low power timing. I don't think I've ever seen a comparably low frequency crystal which isn't 2^15 Hz.
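                That 2^15 figure is the point: fifteen divide-by-two stages (a plain binary ripple counter) turn a 32,768 Hz watch crystal into an exact 1 Hz tick, which is what those cheap timing crystals were made for. A one-line check:

```c
/* Why watch crystals run at 32768 Hz: fifteen divide-by-two stages
 * (one flip-flop each) yield exactly 1 Hz for timekeeping. */
#include <stdio.h>

int main(void) {
    unsigned freq = 32768;                    /* 2^15 Hz */
    for (int stage = 0; stage < 15; stage++)
        freq /= 2;                            /* each stage halves the rate */
    printf("after 15 stages: %u Hz\n", freq); /* prints 1 */
    return 0;
}
```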

              • The project was an inexpensive insertion flow meter, intended to be inserted into a pipe carrying irrigation water. We built our own paddlewheel-style pickup with small rare earth magnets in the vanes, and the magnetic pickup was a rather novel device that used different metals twisted together in a helix, which gave it the interesting property of delivering sharp pulses when the magnetic field it was presented with reversed. The customer-facing meter itself was an off-the-shelf Veeder-Root digital meter. T
                • That sounds like a pretty cool project. I'll bet the power draw at 32kHz was utterly tiny. Also interesting about the wires and magnetic pulses; I've not heard of something like that before. Did that avoid the need for a PIC with an ADC, or did it make it more accurate?

      • The designers will have thrown an ARM microcontroller into the charge management IC, an off-the-shelf design, and thrown an existing USB stack on it. It could be done more efficiently but why bother?

        This right here! I recently switched from 8-bit microcontrollers to 32-bit ARMs for my embedded projects. Nothing at all to do with any incapability of the former, but the ARMs are literally 1/3rd of the cost. That they are close to an order of magnitude faster and have close to an order of magnitude more RAM, to say nothing of their feature set, is completely irrelevant.

        You can't buy parts as slow and underfeatured as those used in older computers anymore.

        • Out of curiosity, what do you use for an OS?

          Can you get predictable interrupt latency like you can with the Atmel parts? One big reason I still use them is knowing that I can count on the program execution to do exactly what I want, when I want it to.

          Cheers!

          • Nope, lower level than that. No OS; I'm talking about Cortex M0 chips, which I typically code in C directly. You don't need an OS just because something has ARM in the name. I treat them just like a more complex AVR.

            There's no reason these days to stick to 8-bit uCs with chips like the STM32 or the SAMD series out there, especially since the latter are often cheaper, unless you're doing something incredibly small where a tiny microcontroller like an ATtiny or similar would suit.
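            For flavor, here's a minimal bare-metal sketch in that "more complex AVR" style: no OS, just C and memory-mapped registers. The addresses below are placeholders, not a real STM32/SAMD memory map; a real project would use the vendor's device header:

```c
/* Minimal bare-metal Cortex-M0 style C: poll one pin, drive another.
 * The register addresses are PLACEHOLDERS for illustration, not a real
 * STM32/SAMD memory map; use the vendor device header in practice. */
#include <stdint.h>

#define GPIO_IN  (*(volatile uint32_t *)0x40000000u) /* hypothetical input register  */
#define GPIO_OUT (*(volatile uint32_t *)0x40000004u) /* hypothetical output register */
#define BTN_PIN  (1u << 0)
#define LED_PIN  (1u << 1)

int main(void) {
    for (;;) {                     /* the whole "OS" is this loop */
        if (GPIO_IN & BTN_PIN)
            GPIO_OUT |= LED_PIN;   /* button held: LED on  */
        else
            GPIO_OUT &= ~LED_PIN;  /* otherwise:   LED off */
    }
}
```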

            • I think the only benefit to an 8-bit would be ultra-low-power applications, but there are a lot of ARM cores that sip power too.
          • by sjames ( 1099 )

            I'm looking into some of that now myself. FreeRTOS and its commercial variants seem like a popular choice for "Operating System" (if you can call it that). Atmel is also offering ARM uCs these days. I have a SAMD51 waiting for a little spare time to check out.

            The whole world seems to be moving to 3.3V these days. One reason I like the old AVR is that it really doesn't seem to mind poorly conditioned power, and as long as you slow the clock down, undervoltage doesn't faze it. It's almost silly how long an AV

        • ...but the ARMs are literally 1/3rd of the cost.

          Yep, ATmega licensing fees are so through-the-roof nuts that many designs are moving to ARM just to avoid them. When the accountants are squeezing cents/unit out of manufacturing costs, saving several dollars on the CPU is a gimme.

      • by tlhIngan ( 30335 )

        Also the reason the chargers are so complex is not because they have to be, it's because computing power is so insanely cheap. The designers will have thrown an ARM microcontroller into the charge management IC, an off-the-shelf design, and thrown an existing USB stack on it. It could be done more efficiently but why bother? In fact for volume manufacturing it's probably cheaper to use a known, tested design even if it needs a bit more silicon.

        Apollo was state of the art because they needed that. USB charge

  • This keeps coming up - how laughable the processing power was on Apollo 11. But when have humans walked on the moon SINCE then? Technology has come a long way, but we're not in a space race right now. The processing power of today is obviously not needed to reach the moon and make a return trip. The most important things are rocket technology, trained astronauts, and dedication. Computers just make things easier.
  • Apparently the chip used is a controller chip, so I guess that's easier than custom-designing something simple for charging.

    USB is really sophisticated when you think about it. If nothing else, the data rate of 10 Gbps would have blown people's minds in the 1960s.

    • by omnichad ( 1198475 ) on Friday February 14, 2020 @09:25AM (#59727880) Homepage

      If nothing else the data rate of 10 Gbps would have blown people's mind in the 1960's.

      I don't know...there were a lot more station wagons on the road back then.

      • by AmiMoJo ( 196126 )

        IBM's fastest tape drive of the 1960s, the 2420 released in 1968, was only capable of about 312 kB/sec, or 2.5 Mbps. So it was 4000x slower... which is actually not as much as I expected.

    • by PPH ( 736903 )

      This is the USB charger we are talking about. Yeah, 10 Gbps for communications is great. But all the charger has to do is handshake with the device, set a current level and maybe generate a PWM signal to regulate the output.
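      As a sketch of that "set a level, maybe PWM" job: the snippet below maps a requested output voltage to a duty cycle for a hypothetical buck converter fed from a 20 V rail. The rail voltage, timer resolution, and function name are all assumptions for illustration:

```c
/* Illustrative only: map a negotiated output voltage to a PWM duty
 * cycle for a hypothetical buck converter. Rail voltage and timer
 * resolution are assumed values, not from any real charger. */
#include <stdio.h>
#include <stdint.h>

#define V_RAIL_MV 20000u  /* assumed converter input rail    */
#define PWM_TOP   1023u   /* assumed 10-bit PWM timer period */

static uint32_t duty_for_mv(uint32_t target_mv) {
    if (target_mv > V_RAIL_MV) target_mv = V_RAIL_MV;
    /* ideal buck: Vout = D * Vin, so D = Vout / Vin, scaled to the timer */
    return (target_mv * PWM_TOP) / V_RAIL_MV;
}

int main(void) {
    /* e.g. the device handshakes and asks for a 9 V or 5 V profile */
    printf("9 V -> duty %u/%u\n", (unsigned)duty_for_mv(9000), PWM_TOP);
    printf("5 V -> duty %u/%u\n", (unsigned)duty_for_mv(5000), PWM_TOP);
    return 0;
}
```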

  • Its control protocol is literally "Ethernet reimagined"!
    Of course totally incompatible, and created from an NIH standpoint.
    Madness, I tell you. 0.8 WhatWGs on the madness scale!

  • by jovius ( 974690 ) on Friday February 14, 2020 @09:40AM (#59727940)

    These comparisons often come about, but who would have a USB charger guiding a spacecraft? There's a huge amount of work involved besides just having the processing power. The computer aboard Apollo was specially built for a certain task, and it did that very well.

    This is like comparing a manual on/off switch to one that can be controlled with an iPhone. Yay, you can control a billion switches at the same time, when you only need to control that one in particular in the most reliable way.

    • by hawk ( 1151 )

      >These comparisons often come about, but who would have a USB charger guiding a space craft?

      Presumably, a non-smoker who didn't need to use the socket for a lighter? :-)

      hawk

    • by djinn6 ( 1868030 )

      These comparisons often come about, but who would have a USB charger guiding a space craft?

      No, but a Raspberry Pi might. In fact, I suspect at least one of those CubeSats uses one. It's definitely overpowered with a 1.2 GHz quad core CPU and 1 GB of RAM.

  • by LynnwoodRooster ( 966895 ) on Friday February 14, 2020 @09:41AM (#59727942) Journal
    Serial used to mean maybe 2 or 3 lines for communications (one for data, one for clock, maybe one for handshaking). Remember the first USB ports? They had two data lines (differential self-clocking data) and then a power and ground pin. Four pins. Now? Serial has grown (USB-C) to 24-pin connectors... That's as big as the IEEE-488 (GPIB) ports of old, and just 1 shy of a Centronics parallel port!
    • by guruevi ( 827432 )

      Most of the pins on USB-C are used for power delivery and half of them are necessary because they are reversible. Data is still only 2 lines, TX and RX on differential lines. The primary problem with the entire USB standard is that they keep trying to cram legacy stuff (USB-2 and audio functionality) into a cable when chips could do so much with much simpler wiring.

      • No, here is what is in a USB C connector [wikipedia.org]. There are four pins for VBUS (power), and there are four ground pins. Two are for power negotiation (configuration channels). The rest (14) are for data. There are 6 serial paths, four super-speed differential, one high-speed differential, and then the side bus (it's like an I2C slow speed config channel).

        Only the four pins used for high-speed differential are legacy pins (D+/D-) and they only need 2, but they double them up to make it "flippable". Most of the
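        Tallying that breakdown (pin counts as given above and in the linked Wikipedia pinout):

```c
/* Tally of the 24 USB-C pins, per the parent comment's breakdown. */
#include <stdio.h>

int main(void) {
    const int vbus = 4, gnd = 4, cc = 2; /* power + configuration channels  */
    const int ss   = 8;  /* four SuperSpeed differential pairs (TX/RX x 2)  */
    const int hs   = 4;  /* legacy USB 2.0 D+/D-, doubled for reversibility */
    const int sbu  = 2;  /* sideband use pins                               */
    printf("power/config: %d, data: %d, total: %d\n",
           vbus + gnd + cc, ss + hs + sbu,
           vbus + gnd + cc + ss + hs + sbu); /* 10, 14, 24 */
    return 0;
}
```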

      • Most of the pins on USB-C are used for power delivery and half of them are necessary because they are reversible

        That's not even remotely correct.
        - Firstly, only two pins are used to identify the orientation of the USB-C connector, and those pins are used for protocol negotiation as well and each provide a different function, so the reversible nature of the connectors doesn't cause any duplication of pins whatsoever.
        - Secondly, USB-PD does not use any additional pins beyond the standard power pins in the USB port (4 of the 24, for voltage). You're probably thinking of Apple's Lightning connector, which indeed does re-purp

  • by nashv ( 1479253 ) on Friday February 14, 2020 @09:42AM (#59727950) Homepage

    ..the Apollo 11 computer had more processing power AND creative imagination than the morons who find this even mildly interesting. I guess we could just replace these morons with USB chargers.

    The Apollo 11 computer is worth something today because of history. Because of the part it played in something hard, challenging and a leap for mankind. Not because of its specs.

  • by FudRucker ( 866063 ) on Friday February 14, 2020 @09:57AM (#59728006)
    a Beowulf cluster of USB chargers on a few power strips
  • by DarkRookie2 ( 5551422 ) on Friday February 14, 2020 @10:04AM (#59728020)
    Power Adapter/AC adapter
    I have yet to see one of those charge my battery.
    They provide power to the phone no problem, and then the phone charges the battery.
    I really hate calling them chargers, since that is incorrect and is only considered correct now because people are fucking idiots.
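    For what it's worth, the charging the phone itself does is typically a constant-current/constant-voltage (CC/CV) profile. A toy model of that logic, with typical single-cell Li-ion thresholds chosen purely for illustration:

```c
/* Toy model of the CC/CV lithium-ion charge logic that lives in the
 * phone, not in the wall adapter. Thresholds are typical single-cell
 * values, used here only for illustration. */
#include <stdio.h>

#define V_CV_MV   4200 /* switch from CC to CV near 4.2 V         */
#define I_CC_MA   1500 /* constant-current phase limit            */
#define I_TERM_MA 100  /* terminate when taper current falls here */

static int charge_current_ma(int vbat_mv, int taper_ma) {
    if (vbat_mv < V_CV_MV)    return I_CC_MA;  /* CC phase           */
    if (taper_ma > I_TERM_MA) return taper_ma; /* CV phase, tapering */
    return 0;                                  /* charge terminated  */
}

int main(void) {
    printf("at 3.7 V: %d mA\n", charge_current_ma(3700, I_CC_MA));
    printf("at 4.2 V, taper 400 mA: %d mA\n", charge_current_ma(4200, 400));
    printf("at 4.2 V, taper 80 mA: %d mA\n", charge_current_ma(4200, 80));
    return 0;
}
```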
    • One thing seems to be sure: an idiot today is equal to an idiot back then.

      Although - people of today _should_ know more. So, in essence, and following the logic of these comparisons, an idiot of today is more of an idiot than the ones before.

      • There's a whole era of people who called remote controls "converters."

        Was it at all accurate? No.
        Was it clear what they were referring to? Sure, it was probably even clear to pedants.
        Were they idiots for using the wrong name for the device? Probably not, though it seems pedants think otherwise.

        • Different words exist to describe different things.
          If not, what is the point of language, then?
          • But sometimes, a single word can mean multiple things in a language: lighter.

            Do I mean the small portable device that can make fire?
            Do I mean something less heavy?
            Do I mean a less dark colour/tone?

            Yes, context is everything, but it doesn't help comprehension if you're not using your native language.

    • This terminology predates phones, laptops, USB, and your autism diagnosis.

    • I have yet to see one of those charge my battery.

      OnePlus phones come with actual chargers. The cable looks like USB-C, but the AC adapter actually sends a charging current matched to the battery voltage to the phone, for fast-charging at 3.4 A. OnePlus claims that not having a high-power DC/DC converter inside the phone is better.

    • "I have yet to see one of those charge my battery."

      I have not seen one charge your battery either, but I have seen them charge *my* battery.

  • by Brett Buck ( 811747 ) on Friday February 14, 2020 @10:52AM (#59728176)

    Yes, it didn't have much processing power. That wasn't a show-stopper; it probably ensured success by keeping OUT the cruft, the over-blown interfaces, and, most importantly, the "programming experts" that would descend on the system now. Software development is the biggest obstacle to returning to the moon now - and it was an interesting but not crippling problem then.

    It would take far longer to develop and "test" today. I put "test" in quotes because virtually no one, least of all most current programmers, has any clue what "test" used to mean, or how what they call tests now don't meet that standard. And the complexity of just the bootstrap code would be such that you would never actually successfully test it in the same sense they meant back in the day, much less the entire flight software.

    You would have people defining "objects" and deciding which "libraries" to use, arguing over which OS to pick (say, VxWorks, or whatever the MATLAB realtime system is called), etc. It was a nightmare even 5 years after Apollo, on the Shuttle, and you had the same bunch of numbnuts laughing about "ha ha, it's only a 386".

          Give most people 10,000 times the processing power, and most programmers would try to find a way to use it. That's why, say, the boot time of a brand new Dell Windows machine is hardly any better - and frequently slower - than the boot time of an IBM PC AT from 1987.

    You would be better off just porting LUMINARY, or whatever other assembly code ran in the AGC, straight across to a cell phone charger and using it unmodified than you would be taking a bunch of the current self-proclaimed experts and giving them free rein.

          The software is not the point of the mission, but it would quickly become the long pole in the tent.
       

  • In a time when computers were the size of giant rooms...

    "In a time", really? You leave me no choice [youtube.com].

  • 50 years later, with handheld computers more powerful than a supercomputer of the era, and we can't even send a manned capsule to the ISS.

    • The capsule will probably just crash into the station when the computer reboots for automatic update during docking procedures.

  • but none of them had the power of a Saturn V.
  • Comment removed based on user account deletion
  • ...otherwise they'd all have perished during test runs and never gotten to the moon and back, because by the time they switched on their rocket, their apps would have gotten 16 bug-fix releases, 12 feature-bloat updates, and 3 crashes a day, with testing, staging, and production deployment done the same day on their disposable navigation systems, which were obsolete and unsupported by the time they charged them for the first time. Try getting to the moon with that!
