AI Power

Sam Altman Says AI Depends On Energy Breakthrough (reuters.com) 105

An anonymous reader quotes a report from Reuters: OpenAI's CEO Sam Altman on Tuesday said an energy breakthrough is necessary for future artificial intelligence, which will consume vastly more power than people have expected. Speaking at a Bloomberg event on the sidelines of the World Economic Forum's annual meeting in Davos, Altman said the silver lining is that more climate-friendly sources of energy, particularly nuclear fusion or cheaper solar power and storage, are the way forward for AI. "There's no way to get there without a breakthrough," he said. "It motivates us to go invest more in fusion."

In 2021, Altman personally provided $375 million to private U.S. nuclear fusion company Helion Energy, which since has signed a deal to provide energy to Microsoft in future years. Microsoft is OpenAI's biggest financial backer and provides it computing resources for AI. Altman said he wished the world would embrace nuclear fission as an energy source as well.
Further reading: Microsoft Needs So Much Power to Train AI That It's Considering Small Nuclear Reactors
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Why is it that these guys are always clamoring and tripping over themselves to state the trite and the obvious? And why the fuck does anybody listen to them or care what they say?

    • Re:Duh! (Score:5, Insightful)

      by Mr. Dollar Ton ( 5495648 ) on Friday January 19, 2024 @09:48AM (#64172391)

      It isn't obvious to me. Sam Altman hitting energy limits most likely means his algorithms are shit. I meet human brains that work better than the "AI" every day on less than 3kcal.

      • Yeah, but it took somewhere between 18 and ca. 28 years to train them, depending on specialization. Going with the average (23) and 2500 kcal per day (kids don't need that much, let's keep it simple), that's about 88 GJ. If you want that brain trained within a week, that's 145 kW... Huh, you're right, I guess his algorithms suck indeed.
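
        For anyone who wants to sanity-check that arithmetic, here is a rough Python sketch; the 23 years and 2500 kcal/day are just the assumptions above, and 4184 J/kcal and 365.25 days/year are standard conversions:

          KCAL_TO_J = 4184                          # joules per kilocalorie
          years = 23                                # assumed "training" time from above
          kcal_per_day = 2500                       # assumed daily intake from above
          energy_j = years * 365.25 * kcal_per_day * KCAL_TO_J
          print(f"total energy: ~{energy_j / 1e9:.0f} GJ")                      # ~88 GJ
          week_s = 7 * 24 * 3600
          print(f"if delivered in one week: ~{energy_j / week_s / 1e3:.0f} kW") # ~145 kW
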
      • off by a factor 10 (Score:5, Interesting)

        by aepervius ( 535155 ) on Friday January 19, 2024 @12:22PM (#64172911)
        The brain consumes, for the average adult, about 0.3 kWh per day (https://bond.edu.au/news/how-much-energy-do-we-expend-using-our-brains), so about 0.3, not 3 (

        And it's equivalent to 260 calories or 1,088 kilojoules (kJ) a day

        ). As a fun comparison, that corresponds to the energy in about 65 g of sugar. But your point still stands.
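
        A quick Python sketch of that conversion, assuming roughly 4 kcal per gram of sugar:

          kwh_per_day = 0.3                   # figure cited above
          kcal = kwh_per_day * 3.6e6 / 4184   # ~258 kcal, matching the quoted ~260
          sugar_g = kcal / 4                  # assumed ~4 kcal per gram of sugar
          print(f"~{kcal:.0f} kcal/day, roughly {sugar_g:.0f} g of sugar")   # ~65 g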

      • Yeah, that's the humbling thing: any computer that approaches a human brain in raw processing power (ignoring the interconnection issue) needs the Hoover Dam to power it, whereas the human brain is what? Eight volts?
  • A few seconds ago I saw a human, a being with intelligence. They take some energy to operate, but not that much. I wonder what he's cooking.
    • Re:Humans (Score:4, Informative)

      by greytree ( 7124971 ) on Friday January 19, 2024 @08:21AM (#64172081)
      As I mentioned in another comment, real AI requires a theory breakthrough.
      Throwing more and more power and data at Fake Chinese Room AI will not give us Real AI, it will just fake it better.

      So of course chancers like Altman will keep telling us "We just need more power, Captain".
      Just like he told us OpenAI was Open Source and not-for-profit.

      Fuck him.
      • All the problems with AI will always be solved by "bigger models" or "more training" and, of course, throwing more money at it.

        Driverless cars are a great example. All the examples of them behaving weirdly are their version of the "hallucinations" of LLMs. Only they have serious power constraints, so you'll probably end up in a place where tuning the model to cover a new scenario winds up losing something else.

        • > All the problems with AI will always be solved by "bigger models" or "more training"...
          > Driverless cars are a great example

          Indeed.

          Tesla now has several billion times more data than it did when it started with FSD. Despite this, the quality of driving has not improved several billion times. It has improved perhaps two times and is still hundreds of times away from where it has to be. If the diminishing returns trend continues, the sun will burn out before that happens.

          I have seen no reason to believ

        • All the problems with AI will always be solved by "bigger models" or "more training" and, of course, throwing more money at it.
          No, the problem is that AI is now a buzzword attached to large artificial neural networks, whose main use is large language models.

          Driverless cars are a great example.
          Exactly. The real driverless cars (such as those in Europe) use hand-coded algorithms and have nothing to do with neural networks. Only Tesla thinks otherwise.

      • Re: (Score:2, Interesting)

        by ThosLives ( 686517 )

        A change in theory, yes, and I'm in the camp that thinks a switch to analog computing is going to be a big part of it.

        Stop trying to "compute" things and instead create systems that just evolve in the desired manner and measure the relevant parameters.

  • Or, you know (Score:4, Insightful)

    by Barny ( 103770 ) on Friday January 19, 2024 @08:22AM (#64172087) Journal

    Or, you know, we could put the toys aside and get back to actually working on those new power systems so that the world as it is can have enough cheap, clean power that people everywhere can afford to turn a light on.

    No, it's not whataboutism. They (AI) are perfectly capable of causing all the same power issues that crypto did—and will happily drive companies to burn more and more coal and gas to do so.

    • Re: (Score:2, Funny)

      But, but, but, fusion power's only 10 years away!
      • OK, which humourless twat down-modded a joke?
      • by XXongo ( 3986865 )

        But, but, but, fusion power's only 10 years away!

        Wow, a breakthrough! It's been 40 years away for the last 80 years.

        • Recently I came across a magazine from the early 1950s or late 1940s in an antique store. There was an article about fusion experiments at Cern, saying 'this miracle source of energy will be powering the world of your grandkids!' Unlikely, unfortunately.
          • When did CERN do "fusion experiments"?

            • I wish I'd bought the magazine. And I saw it more than a decade ago, but I could have sworn it was at CERN. Or at least in Switzerland. Yet I can't find any references online, but that doesn't mean anything; for all I know the PR piece was discussing theorizing only?
                • Hm, interesting. I'm helping a little with some bits of experiments at CERN (particle physics), and I haven't heard anything about fusion, that is, the kind for energy production. There are some nuclear physics experiments, but all are about heavy nuclei https://home.cern/science/expe... [home.cern] Most everything else is boring QCD.

                I'll have to ask if I bump into one of the retired colleagues from years ago. Maybe they killed it like they killed neutrino research a few years back.

    • by AmiMoJo ( 196126 )

      We already have all the tools we need. There is some more development to do with deep water offshore wind turbines, but even those are now pretty mature and in mass production. We know how to transport energy over long distances with HV DC lines.

      If Altman wants more power for AI he can just build it himself. Solar panels, windmills, batteries. Stop complaining and get on with it.

    • We need to get the little people to stop using air conditioning and other modern conveniences in order to save enough energy for the elite to run their AI models that will enhance shareholder value (i.e., put more little people out of work).

    • Or, you know, we could put the toys aside and get back to actually working on those new power systems so that the world as it is can have enough cheap, clean power that people everywhere can afford to turn a light on.

      No, it's not whataboutism. They (AI) are perfectly capable of causing all the same power issues that crypto did—and will happily drive companies to burn more and more coal and gas to do so.

      Yeah, I see two sides to this bozo popping off about needing new energy sources. One? How about these greed worshipping shitheads throw some of their money at THAT problem instead of trying to come up with better ways to throw away the human workforce and create ever more bullshit to wade through on the internet, while tracking us with ever more scrutiny? I mean, anybody with half a clue would know that if they threw the kind of resources into developing energy technologies that they do at these silly machi

      • It isn't capitalism, it's neo-feudalism. We've left capitalism behind a long time ago, what we have now is statist communism.
        • It isn't capitalism, it's neo-feudalism. We've left capitalism behind a long time ago, what we have now is statist communism.

          I don't think I'd call it communism. It's more an oligarchy, with socialism used to prop up industry, and derided and ignored when it comes to the populace. I do agree we've got at least the basics of feudalism going on. There's definitely a ruling class and a peasant class, with all the trappings.

          • Yet I'd describe communism this way, too. It's an oligarchy, too. But it sounds like we're largely in agreement anyway, so ...
  • It's been proven that to make a human-level intelligence you need no more than 800 megabytes of information (for both hardware blueprints and software), and the power requirements are about 100 watts. This is known thanks to the human genome project, and that 2000 calories/day = 100 watts.
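
    Both figures roughly check out on the back of an envelope. A small Python sketch; the ~3.1 billion base pairs, 2 bits per base, and 4184 J/kcal are standard ballpark values, not anything from the article:

      base_pairs = 3.1e9                      # rough human genome size
      genome_mb = base_pairs * 2 / 8 / 1e6    # 4 nucleotides -> 2 bits per base
      print(f"genome: ~{genome_mb:.0f} MB")   # ~775 MB, i.e. roughly 800 MB
      kcal_per_day = 2000
      watts = kcal_per_day * 4184 / 86400     # kcal/day converted to joules per second
      print(f"2000 kcal/day: ~{watts:.0f} W") # ~97 W, roughly 100 W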

    • Yeah, but that's the most advanced intelligence that's ever existed on this planet. Of course, it's orders of magnitude more efficient & more capable than AI. Pity we don't value it & continue to politicise our education systems.
    • You need a bit more than 800 MB because sensory input is information too. A child spends a lot of time taking in information and learning from it. For that matter, it may take even more as humans learn from each other, and from preexisting culture; if you were to plonk a bunch of babies in an uninhabited environment, I wouldn't count on them surviving for very long. Even if they somehow didn't need food. From such a perspective, the 800 MB is just for the initial bootstrap.
      • Not if these 800 MB include the state of the neural connections after the child has grown and learned.

        Anyway, I personally would not buy the 800 MB number (it seems too low). There are approximately 86 billion neurons in a human brain. Using a single byte per neuron would require 86 GB. But also, a typical neuron has around 7000 synaptic connections with surrounding neurons. Using a single bit per connection would require almost 1 KB per neuron, so we would be talking about a total of 86 TB. And this is a ve

          • The 800 MB was referring to the data needed to describe the architecture. As for 86 TB .. that sounds way too high. I believe the memory requirements are a lot less than that. Probably in the GB range if not less. How much data do you think it would take to record everything you can remember? 90% of a day's memories are discarded within a day or two. You probably only remember 5 minutes (averaged out) of any given day 1 year ago. Furthermore .. you don't remember even those 5 minutes in photographic detail ..

          • If we only count the architecture and it doesn't matter that an AI needs 20 years to train itself, then it's easy to get into trouble deciding just where to draw the line. Does a genetic algorithm that takes a thousand years to come up with a human-level AI count as 1 MB, if the code for that genetic algorithm only takes 1 MB? Or maybe general AI could be implemented in a few pages of code using Solomonoff induction, but it would take longer time than the universe has existed to bootstrap itself. Once the A
          • The brain is not a multimedia storage device. It stores data, but also the relations between the data, and the patterns to interpret and manipulate it, and extract useful information. The brain stores, and at the same time, makes decisions and takes action.
        • Not if these 800 MB include the state of the neural connections after the child has grown and learned.

          Anyway, I personally would not buy the 800 MB number (it seems too low). There are approximately 86 billion neurons in a human brain. Using a single byte per neuron would require 86 GB. But also, a typical neuron has around 7000 synaptic connections with surrounding neurons. Using a single bit per connection would require almost 1 KB per neuron, so we would be talking about a total of 86 TB. And this is a very low estimate. It would be difficult to model connections with a single bit...

          If my back-of-the-envelope estimates are correct, 800 MB would allow you to have something as clever as a honey bee (which has around 600K neurons). Note that this is no small feat, given the amount of things that a bee can do completely autonomously. It would make for an incredibly smart drone, for instance.

          You're looking at the information content, not the build complexity. The information needed to build a 1TB drive is much less than the 1TB of data that goes into it. The executable of a database is much smaller than the data it manages.

          The human genome holds about 800 MB of information (3 billion base pairs of 4 nucleotides), and of that about 3/4 is fundamental architecture for things like metabolism.

          The instructions needed to build an intelligent system are much less than 800 MB, and once it's built the "al

          • Well, you also fail to take into account that 800 MB of *data* can do nothing by themselves. To create a human being you need a bit more than just the genome information. You need it in the form of DNA in a cell nucleus, and a living human to nurture it. And parents to make it survive until adulthood, and around 20 years of patience, food and education.

            All in all, those 800 MB are more of a highly compressed form, and only describe an empty shell.

    • by XXongo ( 3986865 )

      It's been proven that to make a human-level intelligence you need no more than 800 megabytes of information (for both hardware blueprints and software), and the power requirements are about 100 watts. This is known thanks to the human genome project, and that 2000 calories/day = 100 watts.

      And 20 years of training and data-input before the H.I. can solve real-world problems.

      • Nah, with just a few years a child can outsmart the AI on many topics. Also, it's dubious we'd be able to make an AGI within 20 years anyway.

    • by RedK ( 112790 )

      Now multiply that over 30-60 years of existence.

      We want to train these models within hours.

    • It's a lot less than 100 watts and 800 megabytes. For one thing, a large percentage of the 100 watts goes to operating other organs/muscles .. same with the blueprints.
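
      For a rough sense of scale, a small Python sketch (the ~20% share is the commonly cited estimate of the brain's cut of resting metabolism, not a figure from this thread):

        body_w = 2000 * 4184 / 86400    # ~97 W from 2000 kcal/day
        brain_share = 0.20              # assumed: brain takes roughly 20% of resting metabolism
        print(f"brain alone: ~{body_w * brain_share:.0f} W")   # ~19 W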

  • He's wrong (Score:4, Informative)

    by bradley13 ( 1118935 ) on Friday January 19, 2024 @08:25AM (#64172097) Homepage

    No, we need efficiency, which will come: better models, continuing hardware improvements, etc. It is already possible to run smaller models on an ordinary PC; soon enough, they will fit on a smartphone.

    FWIW, training is nearly irrelevant. Models are used far more often than they are trained.

    • Re:He's wrong (Score:4, Insightful)

      by godrik ( 1287354 ) on Friday January 19, 2024 @08:44AM (#64172147)

      At the scale they seem to need to do it, it is not clear that hardware/software efficiency gains are the whole solution. Don't get me wrong, we should pursue them. But our current ML stacks are already using our hardware decently well, to the point where I don't expect we can really gain more than a factor of 100 anymore.
      And based on the type of usage they believe we'll see, we more likely need energy-consumption gains on the order of a million.

      And training is not that irrelevant. If you look only at the cost of training the final model that gets deployed, yes, that's pretty negligible. But for an honest metric: to train one model, you typically tried thousands, maybe millions, of hyperparameter variants. And that definitely takes time and power.

      Though I agree that eventually the cost of running inference will have to be much smaller than the (full) cost of training, simply to be commercially viable.

      • by olau ( 314197 )

        Not an expert, but I don't think this is right. The problem is not the coding. It's the current software and hardware architectures. The mainstream hardware architectures were never designed for this kind of massively parallel computation. And the software architectures are just not right; they're grossly inefficient compared to wetware. The human brain learns continually, and it's not using backpropagation.

    • in time for him to cash out stock options and become a billionaire on par with Musk & Gates? Will he, for example, be one of the first trillionaires?
  • Just the other day, Altman said they were going to start doing military AI work.

    Now, it needs a lot more energy.

    This is starting to sound familiar . . .

    What's next? Send the robot back in time to thwart a plan to stop the military AI from being invented?

  • by necro81 ( 917438 ) on Friday January 19, 2024 @08:34AM (#64172117) Journal
    "We don't know who struck first: us or them. But we know it was us that scorched the sky. At the time they were dependent on solar power, and it as believed that they would be unable to survive without an energy source as abundant as the sun. Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony....

    "What is the Matrix? Control. The Matrix is a computer-generated dream world, built to keep us under control, in order to change a human being into this [holds up battery]."
    • by brunes69 ( 86786 )

      The instant thought problem posed by the matrix is... wouldn't it be a lot easier and more efficient to farm and placate cattle or other large mammals in a virtual world than humans?

      • by necro81 ( 917438 )

        wouldn't it be a lot easier and more efficient to farm and placate cattle or other large mammals in a virtual world than humans

        One could just as easily ask: using humans to power a giant computer simulation, with the notion that they'd produce surplus energy, seems kinda farfetched. Think about it for very long, and you run into perpetual motion paradoxes and breaking the 2nd Law of Thermodynamics. It's science fiction, so a certain suspension of disbelief is required right out of the gate, in service o

      • This is in fact the single biggest problem for the movie.

        It doesn't make even a small amount of sense that humans are their power source. It barely even makes any sense to have humans be nodes in their cluster, but at least that makes a little bit.

        However it was believed that people would be too stupid to understand the idea that human minds were all linked to make one big computer (that sentence would literally have been enough explanation, but anyway) so they made them a power source. Which, as previously

        • by jacks smirking reven ( 909048 ) on Friday January 19, 2024 @09:53AM (#64172415)

          Yeah, a definite case of the studio thinking the audience isn't just stupid but very stupid. In the scene in question, where Morpheus holds up a battery and says "to turn a human being into this," if he had held up a microchip or a CPU most people would get it, and it tracks so much better with the philosophical issues they try to wrestle with in the latter two films.

          I would contend the first Matrix is a "good" movie. I recently re-watched it for the first time in a decade and was expecting it not to hold up as well as it did.

          It's very well paced and introduces its mystery-box elements well, the fight choreography is good, the casting is good with a compelling villain, the production absolutely has a unique sense of style, even the green color grading was unique to it at the time (movies afterward all aped the idea from it), and even the effects mostly hold up.

          Even if the high-level concept can get a little silly, the on-the-ground concept works. I think it gets dragged down a bit unfairly by its own sequels and by how many tropes it has inspired in other, worse films.

          Also, let's appreciate that it's one of those few films that, when it shows "hacking," actually shows something quasi-realistic, like just a terminal window.

        • This is in fact the single biggest problem for the movie.

          It doesn't make even a small amount of sense that humans are their power source. It barely even makes any sense to have humans be nodes in their cluster, but at least that makes a little bit.

          From what I recall they said a form of fusion was employed to generate the power.

          • They said, IIRC, "combined with a form of fusion" but this still makes no sense because keeping all those humans alive is inefficient. The energy used to make the food goo would be more than the humans add to the equation.

            • They said, IIRC, "combined with a form of fusion" but this still makes no sense because keeping all those humans alive is inefficient.

              It's a sci-fi element of their story that humans are somehow needed for the fusion thing to work.
              From memory, they were required to do the whole VR thing to prevent something akin to crop failure, so it seems undeveloped, brain-dead humans were not of use to them.

              I can understand someone calling bullshit on a nuclear reactor exploding as if it were an atomic bomb in Resident Evil, yet faulting the introduction of physics, gadgets and technology to further a fictional sci-fi plot line is quite a different story.

              • In that case Heisenberg compensators, warp drives, subspace, tachyon beams, inertial dampeners and anti-grav beams make no sense either.

                All of those things except (maybe? probably?) the first one are or were theoretically possible, that's why they're science fiction. But there's no explanation for why the machines should need humans to make fusion work, and then let's go beyond that — why would they want to use fusion? Fission byproducts are not a big problem for the machines. Also, they clearly have enough tech to put stuff into orbit, so... orbital solar arrays? Or hell, they could do solar blimps, given that the cloud layer is appa

      • by matmos ( 8363419 )
        I always figured it kept humans around because the "Architect" had a huge ego and personality all its own; it wasn't just cold, hard logic. It wanted to prove the superiority of its intellect over all the other machines/"programs" and people.
  • Burying the lede (Score:5, Insightful)

    by MacMann ( 7518492 ) on Friday January 19, 2024 @08:35AM (#64172119)

    "Altman said he wished the world would embrace nuclear fission as an energy source as well."

    We need nuclear fission for energy if we are to have our nice toys like AI, rockets to space, electric cars, or whatever else it is you envision the future holds.

    • Google "Induced Demand". As for AI, what he's after is the massive widespread use of it that replaces 30-40% of workers.

      At that point the energy problem might solve itself. I mean, if 40% of your population is unemployable does it really matter if they have electricity? Couldn't we just redirect the power they use now to AI? It would probably still be cheaper than the wages paid to keep them productive enough to be worthwhile.

      And why yes, that's dystopian as ****.
      • by matmos ( 8363419 )
        Well, if you have 40% unemployment and don't have something in place like guaranteed income, then you have a B.F.P. All your billions won't matter when the starving masses tear down the doors to your bunker, lol.
    • by ras ( 84108 )

      Altman said he wished the world would embrace nuclear fission as an energy source as well.

      If it made economic sense, OpenAI would build a nuclear plant. And since the price of energy is apparently a problem, it would make sense to do just that, no? After all Google and Meta have done that with renewables.

      OpenAI hasn't done it because it knows new nuclear is the most expensive source of energy on the planet. He must know that, but still he wishes for it?? What is he asking for - a government handout in the for

  • ITER's [wikipedia.org] General Director says nuclear fusion depends on AI breakthrough.
  • Each human produces about as much energy as a 65-watt lightbulb. What could go wrong powering our AI overlords.

  • A year ago, AI was a new buzzword. Now, it's something the entire world is being told to bend around, because some rich dudes at the top have said so.
    • by dfghjk ( 711126 )

      Exactly. "We" must make abundant energy free so that Sam Altman can become richer. How dare the profits got to the electric company and not to AI pioneers.

      How about this? AI's massive demands for energy means we should stop investing in AI? What is AI accomplishing other than wealth and power for a few people?

      • Exactly. "We" must make abundant energy free so that Sam Altman can become richer. How dare the profits got to the electric company and not to AI pioneers.

        How about this? AI's massive demands for energy means we should stop investing in AI? What is AI accomplishing other than wealth and power for a few people?

        I think at some point, if we really want to tackle the coming energy crisis, we're going to have to set some form of limit for any given business where, "You require this much energy? You're cut off the grid. Make it yourself." That would drive some of these greed worshiping assholes into researching energy production rather than better ways to screw the rest of the population for profit. It may even make them more money in the end if they come up with some novel solution to the energy problem that they're

        • That would drive some of these greed worshiping assholes into researching energy production rather than better ways to screw the rest of the population for profit.

          The literal fucking summary, you dumbass:
          "In 2021, Altman personally provided $375 million to private U.S. nuclear fusion company Helion Energy, which since has signed a deal to provide energy to Microsoft in future years."

    • by e3m4n ( 947977 )
    so they don't have to have employees and can keep even more of their money. Finished that thought for you. Here's the third leg in this dilemma: AI development by nation states. N Korea steals, hacks, and ransomwares a fuck ton of money to invest all-in on AI. Then China does the same. With nation states using AI for military planning, strategy, economic digital attacks, and social unrest, it's going to drive other nation states to invest heavily in AI as a matter of national defense. This in turn drives demand for
    • Slashdot has run an article about fission research almost weekly since forever. Clean and cheap electricity breakthroughs that destabilize long-standing power structures have been the main plot device of a number of popular movies. Have you been living under a rock?

      Trying to spin a call for advances in power generation as classist is stupid for so many reasons it hurts to think about.

  • Clearly the breakthrough we need is to just figure out how to use humans as organic batteries.

    • Clearly the breakthrough we need is to just figure out how to use humans as organic batteries.

      We use more energy than we produce. That whole concept is dur level dumb. As much as I like The Matrix as entertainment, I can't imagine a bigger waste than keeping human bodies alive just to suck up their energy output. I know, vampire machines sound cool, but that was the dumbest possible way to go about it.

  • Can he get AI to finish solving that problem first before he starts to use it?
  • AI is all about scalability; both Meta and Mistral are already on that path, with the latter topping GPT-3.5 with Mixtral 8x7B, which runs on a single high-end consumer GPU. Altman is an imposter.
  • Nuclear power! Better than coal!

  • They are so very far from AGI and anything like the efficiency of the human brain.

    • Yep. I have a feeling we will figure the efficiency out first. Clustered memristors need little power and can emulate in hardware the way human neural connections work.

      I think IBM has been working on that for a few years now, and I doubt they're the only ones.

  • by PubJeezy ( 10299395 ) on Friday January 19, 2024 @10:12AM (#64172487)
    Chatbots don't need fusion to exist. This is a ridiculous statement. AI is being talked about by the exact same scammers in the exact same tone and the conversation has the exact same problem...WHAT'S THE ACTUAL USE CASE?!?!

    Crypto had absolutely no legal use-case at scale. Nearly all transactions were automated and speculative and the only real world transactions that it was used for were for drug sales. AI is the exact same thing. Chatbots are awful managers, customer agents, friends, etc...but they're great for online fraudsters running swarms of social media accounts. It's great for the guy filling your inbox with spam. It's great for the psychopath sending you phishing text messages.

    They need all that energy? To do what?
  • by magzteel ( 5013587 ) on Friday January 19, 2024 @10:43AM (#64172593)

    If AI is so smart then why doesn't Sam just ask ChatGPT to design this energy breakthrough that it depends on? And then ask ChatGPT to build it too

  • AI might be a better use of energy than crypto mining.
  • ... excessive energy consumption? What kind of crappy-in-all-aspects tech is this?

  • A new algorithm that doesn't require as much power. Either one will do, really, and calling for either one is just as unlikely
    • Wasn't it about 36 cents per query right now? I assume it's not all power usage, but apparently it's more than I initially thought...

      Something people forget about unlimited power is the WASTE HEAT that is always produced. Computers are all smart heaters!

      What we'll need then is a new kind of chip that cools while it computes... and then combine them...

      So which will we get first: intelligence or cold fusion?

  • You only need power to train models. Once you train a perfect or near-perfect AGI model... how much power do you need to run it? Survey says about 100 W instantaneous, which is about what a human generates. But it only needs 100 W when doing an actual task, which is usually 3-20 seconds at a time, a couple times an hour. How much do you use ChatGPT?
     
    I suspect there will be a huge spike in power needs, but actually running the model will be less than 20 W per person per day.
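
    A rough duty-cycle sketch in Python under those assumptions (the 10 s per task and 2 tasks per hour are just illustrative picks from the ranges above):

      peak_w = 100                    # assumed draw while answering
      seconds_per_task = 10           # assumed, middle of the 3-20 s range
      tasks_per_hour = 2              # "a couple times an hour"
      duty_cycle = seconds_per_task * tasks_per_hour / 3600
      print(f"average: ~{peak_w * duty_cycle:.2f} W per user")   # ~0.56 W, well under 20 W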

  • Doesn't he know about neuromorphic chips? They use 10,000 times less energy. But programmers need to shift their tools to use the chips. Guess it's too hard...
