World's Most Powerful Private Supercomputer Will Hunt Oil and Gas

Nerval's Lobster writes "French oil conglomerate Total has inaugurated the world's ninth-most-powerful supercomputer, Pangea. Its purpose: seek out new reservoirs of oil and gas. The supercomputer's total output is 2.3 petaflops, which should place it about ninth on today's TOP500 list, last updated in November. The announcement came as Dell and others prepare to inaugurate a new supercomputer, Stampede, in Texas on March 27. What's noteworthy about Pangea, however, is that it will be the most powerful supercomputer owned and used by private industry; the vast majority of such systems are in use by government agencies and academic institutions. Right now, the most powerful private supercomputer for commercial use is the Hermit supercomputer in Stuttgart; ranked 27th in the world, the 831.4-teraflop machine is a public-private partnership between the University of Stuttgart and hww GmbH. Pangea, which will cost 60 million euros ($77.8 million) over four years, will assist decision-making in the exploration of complex geological areas and help increase the efficiency of hydrocarbon production, in compliance with safety standards and with respect for the environment, Total said. Pangea will be housed at Total's research center in the southwestern French city of Pau."
  • by Anonymous Coward

    Try again, editors; leave your geek card at the door.

  • by Anonymous Coward

    2.3 gigaflops is faster than 831 tflops?

  • 2.3 gigaflops? (Score:4, Insightful)

    by shoppa ( 464619 ) on Monday March 25, 2013 @12:26PM (#43273223)
    2.3 gigaflops is on most everyone's desktop today. Maybe you mean 2.3 teraflops?
    • by shoppa ( 464619 )
      Crap, I got it wrong too. Not 2.3 teraflops either. 2300 teraflops = 2.3 petaflops.
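
      A quick sketch of the prefix arithmetic, using only the two figures from the summary (the prefix table is just standard SI):

        prefixes = {"giga": 1e9, "tera": 1e12, "peta": 1e15}
        pangea = 2.3 * prefixes["peta"]    # 2.3 petaflops, per the summary
        hermit = 831.4 * prefixes["tera"]  # 831.4 teraflops, per the summary
        print(pangea / prefixes["tera"])   # 2300.0 -- i.e. 2300 teraflops
        print(pangea > hermit)             # True, so the ranking is consistent
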
    • by Anonymous Coward

      Nope! The article is right. They just bought a used Dell and named it Panega.

      Slow news day.

  • Amateurs (Score:1, Insightful)

    by Anonymous Coward

    If they really want to destroy the planet the fastest through global warming, go build a shitload of limestone quarries, and then dump the limestone in rivers. I'm sure they could get us to Venus levels in a couple of decades.

    • Re:Amateurs (Score:4, Insightful)

      by interval1066 ( 668936 ) on Monday March 25, 2013 @12:51PM (#43273585) Journal
      I couldn't care less about that nonsense. I'm waiting for the day they use this system to hunt humans. Dissenters, etc...
    • Go and revise some basic level - as in teenage schooling - chemistry. Dumping limestone (essentially calcium carbonate) into a river is, if it does anything, going to reduce atmospheric carbon dioxide. But it's not actually going to do very much. If you dumped it into acidic river water (mine drainage, or peat bog drainage) then you'd get a little CO2 out, but not a large amount. Most river waters are, if anything, a touch alkaline.

      Not just an AC, but a dumb AC.
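
      For what it's worth, the textbook weathering reactions behind that argument (a sketch from basic chemistry, not from the thread):

        CaCO3 + CO2 + H2O -> Ca(HCO3)2    (carbonate weathering: consumes CO2)
        CaCO3 + 2H+ -> Ca2+ + CO2 + H2O   (acid attack: releases CO2)

      In neutral-to-alkaline river water the first reaction dominates, so dumped limestone would be, if anything, a slow CO2 sink.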

  • by fgodfrey ( 116175 ) <fgodfrey@bigw.org> on Monday March 25, 2013 @12:27PM (#43273231) Homepage

    This has to be 2.3 *peta* FLOPS not giga FLOPS. For instance, in 2010, an Intel desktop processor could do 109 gigaFLOPS (reference: http://en.wikipedia.org/wiki/FLOPS [wikipedia.org]).

    • This has to be 2.3 *peta* FLOPS not giga FLOPS. For instance, in 2010, an Intel desktop processor could do 109 gigaFLOPS (reference: http://en.wikipedia.org/wiki/FLOPS [wikipedia.org]).

      With any relevant workload? Hardly. You just won't get the memory bandwidth with your puny Intel CPU. But I guess it's good for writing flashy ray tracers...

    • This has to be 2.3 *peta* FLOPS not giga FLOPS. For instance, in 2010, an Intel desktop processor could do 109 gigaFLOPS (reference: http://en.wikipedia.org/wiki/FLOPS [wikipedia.org]).

      Ah, but then the computer will not get work done because it is too busy trying to convert us to veganism. :-)

  • Oil (Score:2, Insightful)

    by Anonymous Coward

    So.... why are we wasting the most powerful computer on non-renewable sources of energy?
    Hunting for big oil just seems... wrong. Why can't we use this computer to find cures, track the stars, simulate atoms?

      Why can't we use this computer to find cures, track the stars, simulate atoms?

      Why, you'd like to try to split the computer?

      Because oil is where the money is. And as it gets more scarce, it will be even more so.

      Ever wonder why nobody seems to fight for ending hunger, or world peace, or finding alien life? Because I know you can't buy a yacht and a mansion on every continent by keeping people from starving to death.

      • Re:Oil (Score:4, Insightful)

        by the eric conspiracy ( 20178 ) on Monday March 25, 2013 @12:49PM (#43273567)

        That's because there is plenty of food. The problem is political, not technical or economic.

      • ...you can't buy a yacht and a mansion on every continent by keeping people from starving to death.

        Huh? You mean nobody at Monsanto owns a yacht, or a mansion on every continent?

        • You mean nobody at Monsanto owns a yacht, or a mansion on every continent?

          Probably only MBAs and other such parasites. The actual people who provide the intellectual capital to the company are unlikely to be startlingly well paid.

          And I doubt that many MBAs have mansions on the Antarctic continent.

    • Re: (Score:1, Troll)

      by nedlohs ( 1335013 )

      Because it is their money and that is how they want to use it.

      Feel free to build your own and use it for whatever you want as well.

    • How would a supercomputer help solar panels make more power?
      1. I didn't realize computers were smart enough to analyze cancer and find the magic cure. So that's what we've been doing wrong all these years!
      2. I can use my home desktop to track stars. There's no variables involved here, just constants. Remember, they were tracking stars before computers, linear algebra, and calculus were invented.
      3. And atom simulators can run on a sub $5,000 workstation.

      • by manicb ( 1633645 )

        Hi. I use supercomputers to model materials for next-generation solar cells. If you're working on semiconductors, you need to probe the electronic structure of crystals with junctions and defects; basic atomistic calculations aren't going to be enough (although they have their own uses and are definitely part of the picture). These calculations help us to work out why some promising materials don't perform well in experiments, and figure out which reactions will be suitable for cheap large-scale production

    • Because it's not your computer.

      We are not yet at a place where "the public" can just waltz in and demand the use of a privately owned super-computer.

    • Hunting for big oil just seems... wrong.

      Indeed. Drilling for it seems much more appropriate.

      • Long before (typically a decade or several before) you get to putting a drill bit into the ground, you need to find the oil. For which you need seismic data, some serious number crunching to analyse that data, and a lot of brain sweat.
        • But no rifle, bow or other hunting weapon, no dogs, nor anything else usually needed for hunting.

          Or in short: Whoosh.

          • woosh indeed.

            for a joke to be effective, it needs to be recognisable as a joke. Your (near) namesake Marijke sometimes tripped up over this too. Gone, but not forgotten (partly because she died in possession of several of my books).

      So.... why are we wasting the most powerful computer on non-renewable sources of energy?

      Firstly, "we" are not doing anything with this computer. A bunch of businessmen have decided to try to make more money for their shareholders by investing the investors' money in this exploration tool. I believe that this company has been working in hydrocarbon exploration and production for quite a while - certainly longer than I've been alive and probably longer than you've been alive, so there's probably a pretty good

  • Will the public have access to the results (thinking property values) or will it only be available to a select few?
    • by Garin ( 26873 )

      No, the public won't see these results as a rule, at least not right away (while it's still commercially valuable to protect as a secret), though strictly speaking it depends on the countries involved. Nor would it matter for property value, as the "land owners" usually don't own the mineral rights in most places. Furthermore, this setup will be used probably mostly for marine/offshore seismic imaging, ie not much land involved.

  • Right now, the most powerful private supercomputer for commercial use is the Hermit

    Except for that 600 petaFLOP private supercomputer [bitcoinwatch.com] for commercial use which also happens to be the most powerful computer ever constructed by mankind for any purpose.

    • by h4rr4r ( 612664 )

      How about useful commercial work?

      How about just less single purpose?

      It is not like you can take a bitcoin rig and do anything worthwhile with it.

      • It is not like you can take a bitcoin rig and do anything worthwhile with it.

        I'm sure that every supercomputer can be broken down until you get a subassembly that is not capable of anything worthwhile by itself.

        Bitcoin, as the largest distributed supercomputing project ever, is doing an increasing amount of useful commercial work [pymnts.com] every day.

        • by h4rr4r ( 612664 )

          I don't see the useful part of that. It is still just more fake money printing and scamming.

          Sure, but a whole bitcoin rig does nothing useful, is my understanding. Is there an actual use for those ASICs?

        • Bitcoin, as the largest distributed supercomputing project ever,

          Citation needed.

          Not just a citation, but some evidence too.

          • Citation needed.

            Not just a citation, but some evidence too.

            A year ago I would have provided links for you, but now I'll just suggest the burden of proof rightfully belongs to anyone claiming that a project larger than Bitcoin exists.

            • Five, maybe ten, years ago, the same claim was made concerning SETI@home. That was a project which had a lot of nerd support, and was intellectually respectable, so people like admins for university networks could (and did) put the client onto blocks of thousands of machines which would otherwise be idle during lectures. And if the $LIBRARY$ (or wherever else) got to turn down the heating for X $$$$/day while turning up the mains utility usage for a similar amount ... well that's "swings and roundabouts" in t
              • You are seriously out of the loop in this area.

                The processing power devoted to Bitcoin mining is approaching two orders of magnitude more than Titan [wikipedia.org].

                The institutions you are talking about have already been left in the dust before they even realized something was happening. Even if any of them decided to compete in this area Bitcoin would pull ahead another order of magnitude before they could even finish the contracting process.

                • The processing power devoted to Bitcoin mining is approaching two orders of magnitude more than Titan [wikipedia.org].

                  Why did you drag this "Titan" thing out of the woodwork? It's a lump of ironmongery designed by a committee and funded by a bunch of small-dicked gun nuts. Nothing interesting there.

                  Anyway, no comment on some verifiable figures for the size of the Bitcoin network, so it's safe to assume that you don't know the answer. Probably something hugely overblown if it's got such little verifiable about it.

                  • Probably something hugely overblown if it's got such little verifiable about it.

                    Quoted for posterity. Thanks for the comedy.

      • It is not like you can take a bitcoin rig and do anything worthwhile with it.

        I'm typing this slashdot comment right now with it. My graphics card is mining in the background, in parallel with providing the display for both monitors. The displays only slow it down about 2%, and playing video on the larger monitor eats about 1/6 of the mining speed. If I was playing a heavy duty video game I would have to shut down the mining program, but most other PC tasks can work just fine in parallel.

    • If you are going to include distributed systems, then Google's million or so servers probably takes the prize for most powerful. We don't know for sure, since they don't publish details about their data centers.

  • by Anonymous Coward

    I bet cold-pressed humans are a wonderful source of hydrocarbons.

    • You lose.

      They're very wet. You'd do better with olives. Or sunflower seeds. Or even (shock, horror!) oilseed rape.

  • by Anonymous Coward

    Quite an impressive system in general terms, 2.3PF without accelerators says a lot about the size of this machine (48 racks):

    "Pangea is manufactured by SGI, built on the ICE-X platform. In a video, Total said that each blade contains four Xeon processors (most likely the E5-2600, which SGI uses), each with 32 cores and 128 Gbytes of RAM. Each M-Rack contains 72 blades, for a total of 288 processors, 2304 cores, and 9 TB of RAM. An M-Cell contains four M-Racks and 2 C-Racks for 288 blades, 1,152 processors,

    • A system of this size with accelerators would easily exceed 10PF, although I am not sure whether the particular workload to be run on this beast would be suitable for any kind of accelerator (does anybody have an idea on that?).

      Most of the work will be processing seismic data, which is highly parallelisable. Start with a microphone and an air gun; fart the air gun into water and record the echoes with the microphone. Then invert the recording to get a first guess at how far down the contrasts of acoustic impedance

  • by Anonymous Coward on Monday March 25, 2013 @12:53PM (#43273621)

    Oil and gas?! How about Solar? Wind? GeoThermal? LFTR/Thorium? Why are we blowing computer power on dying industries!?

    What is this?! The Freakin' Flintstones?!

    • by shadowrat ( 1069614 ) on Monday March 25, 2013 @01:03PM (#43273759)

      Oil and gas?! How about Solar? Wind? GeoThermal? LFTR/Thorium? Why are we blowing computer power on dying industries!?

      Well, it seems like a waste of computing power to use it to find those things. I mean, we already know where the sun is. We already know where it's windy.

      • Well, it seems like a waste of computing power to use it to find those things. I mean, we already know where the sun is. We already know where it's windy.

        Actually, we could know this much better, but it's remarkably hard to figure out without dropping a kajillion weather stations.

        • by khallow ( 566160 )

          Actually, we could know this much better, but it's remarkably hard to figure out without dropping a kajillion weather stations.

          At a few thousand dollars per automated weather station (including low data satellite link and solar power system), it doesn't take long to blanket the US especially given how the US government spends money.
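
          A rough sketch of that estimate (every number below is my assumption, not khallow's):

            us_area_km2 = 9.8e6             # approximate US land area (assumed)
            grid_spacing_km = 10            # one station per 10 km grid cell (assumed)
            cost_per_station = 5000         # "a few thousand dollars", per the comment
            stations = us_area_km2 / grid_spacing_km ** 2
            print(int(stations))                      # ~98,000 stations
            print(int(stations * cost_per_station))   # ~$490M -- small by federal standards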

      • by tyrione ( 134248 )

        Oil and gas?! How about Solar? Wind? GeoThermal? LFTR/Thorium? Why are we blowing computer power on dying industries!?

        Well, it seems like a waste of computing power to use it to find those things. I mean, we already know where the sun is. We already know where it's windy.

        Use that power to simulate solar energy collection systems for optimal designs, to model materials-science problems and develop more efficient capture systems, and to produce potential scenarios for climate change and where best to target a means of reversing it, etc.

      • Oil and gas?! How about Solar? Wind? GeoThermal? LFTR/Thorium? Why are we blowing computer power on dying industries!?

        Well, it seems like a waste of computing power to use it to find those things. I mean, we already know where the sun is. We already know where it's windy.

        Waste of computing power? Most likely.

        Waste of money to those invested in it to help feed a greedy and corrupt oil industry? Not hardly.

        Just follow the money. Oddly enough, it always seems to lead to the true answers.

      • by olau ( 314197 )

        Actually, you laugh, but Vestas, the largest wind turbine manufacturer in the world, bought a supercomputer two years ago [vestas.com].

        They use it to run simulations of wind conditions back in time. In some sense, these are terrain-aware interpolations of existing measurements.

        The thing is that when you're trying to figure out the economics of putting up a wind turbine, it doesn't really help you terribly much that you know the wind conditions 20 miles away in the nearest town. They've built a tool on top querying the simulated data
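
        A toy illustration of that kind of spatial interpolation (plain inverse-distance weighting; a real terrain-aware model is far more involved, and none of these numbers come from Vestas):

          import numpy as np

          sites = np.array([[0.0, 0.0], [20.0, 5.0], [5.0, 30.0]])  # hypothetical measurement sites (km)
          speeds = np.array([6.5, 7.2, 8.1])                        # hypothetical mean wind speeds (m/s)

          def idw(p, power=2):
              # Weight each measured site by inverse distance to the query point
              d = np.linalg.norm(sites - p, axis=1)
              w = 1.0 / np.maximum(d, 1e-9) ** power
              return np.sum(w * speeds) / np.sum(w)

          print(idw(np.array([10.0, 10.0])))  # estimate at a prospective turbine site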

  • by jones_supa ( 887896 ) on Monday March 25, 2013 @01:01PM (#43273717)
    How can sheer number-crunching power find the actual oil/gas reservoirs?
    • by Garin ( 26873 ) on Monday March 25, 2013 @01:10PM (#43273857)

      Seismic imaging. Imagine solving a wave equation (acoustic, elastic, or worse) over a 3D grid many kilometers on a side with grid spacing on the order of meters. Imagine you're doing it with a strong high-order finite-difference code. Calculate for tens of thousands of timesteps. Now repeat that entire thing thousands of times for a given full survey.

      No matter how much computer you have, it's never nearly enough for seismic imaging.
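
      For the curious, a minimal sketch of the kind of kernel being described: a second-order 2D acoustic stencil on a toy grid (real imaging codes are 3D, high-order, with absorbing boundaries and real velocity models; every number below is made up):

        import numpy as np

        n, dx, dt, c = 200, 10.0, 1e-3, 1500.0   # grid points, m, s, m/s (toy values)
        p_prev = np.zeros((n, n))
        p = np.zeros((n, n))
        p[n // 2, n // 2] = 1.0                  # impulsive "air gun" source
        for step in range(1000):
            lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                   np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p) / dx ** 2
            p_next = 2 * p - p_prev + (c * dt) ** 2 * lap   # leapfrog time update
            p_prev, p = p, p_next
        print(p[10, :5])                         # "recorded" samples near one edge

      (np.roll wraps around, i.e. periodic boundaries, which no real code would use; it just keeps the sketch short.)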

      • Seismic imaging. Imagine solving a wave equation (acoustic, elastic, or worse) over a 3D grid many kilometers on a side with grid spacing on the order of meters. Imagine you're doing it with a strong high-order finite-difference code. Calculate for tens of thousands of timesteps. Now repeat that entire thing thousands of times for a given full survey.

        No matter how much computer you have, it's never nearly enough for seismic imaging.

        Exactly right! Geophysics (and particularly oil & gas) was one of the primary driving forces behind supercomputer design, and computing in general. As an aside: many of you are probably not aware, but TI started out as a geophysical instrument manufacturer.

          Many of you are probably not aware, but TI started out as a geophysical instrument manufacturer.

          Gravity meters, wasn't it?

      • Much better explanation than I had! :)

        I used to work for an HPC vendor; about 1/3 of our sales were to oil/gas. From what I understood, it was pretty easy for our customers to parallelize their workload to run across thousands of off-the-shelf servers with gigabit Ethernet, versus having to buy a much more expensive setup with high-speed interconnects.
        • by Shinobi ( 19308 )

          Then you understood it wrong. Seismic imaging is one of those tasks that benefits from SSI and the like, and in the absence of that, from high-speed, low-latency interconnects such as InfiniBand.

          The higher the resolution of the grid, the more data you need to pass back and forth, at high speed, with as little latency as possible. Each cell needs to exchange data with its neighbours, and the datasets are very large.

          Back when I started out with HPC, this was one of the tasks where a 128 CPU Origin 3000 SSI with 128GiB shared RAM
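
          The exchange being described is the halo (ghost-cell) swap; a serial sketch of one rank's per-face traffic (sizes illustrative only):

            import numpy as np

            local = np.random.rand(102, 102, 102)              # 100^3 interior plus a 1-cell halo
            face = np.ascontiguousarray(local[1, 1:-1, 1:-1])  # layer sent to the -x neighbour
            print(face.nbytes)      # 100 * 100 * 8 = 80,000 bytes per face
            print(6 * face.nbytes)  # six faces, every timestep, on every rank

          Shrink the per-rank subdomain and the faces shrink more slowly than the interior volume, which is why interconnect latency and bandwidth end up dominating.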

          • by Garin ( 26873 )

            Nah, he's not wrong. But neither are you.

            Seismic processing is about as embarrassingly parallel as it comes. Just about every processing step can be split up into e.g. single shot record steps, taking advantage of assumed linearity in wave equations. Furthermore, most production industrial imaging codes weren't actually using my original example of a full finite difference solution until quite recently, and instead they were using algorithms that have been developed for decades under the limitations of very

            • by Shinobi ( 19308 )

              Well, the point he was trying to make was that the pile of OTS+gig-E outperformed the "traditional" approach, and mine was that, no, it didn't. It worked well enough for many, but at the leading edge, pushing SOTA, the traditional shared-memory setups worked far better. And nowadays, if you try to push the SOTA, you need Infiniband or similar, because gig-E and even 10gig-E will choke.

              And yes, it can easily be divided into cells, but you still have to pass that data around, and that's where the interconnect

              • Maybe another possible area of confusion is that oil and gas companies use big computers for things other than seismic imaging. For instance, linear programming for refinery planning.

      • >> Imagine solving a wave equation (acoustic, elastic, or worse) over a 3D grid many kilometers on a side with grid spacing on the order of meters. Imagine you're doing it with a strong high-order finite-difference code. Calculate for tens of thousands of timesteps. Now repeat that entire thing thousands of times for a given full survey.

        I put on my robe and wizard hat.

    • by toastar ( 573882 ) on Monday March 25, 2013 @01:15PM (#43273929)
    Processing seismic data takes a ton of power; there are well-known techniques we still can't use for lack of computing power. The last big advance was RTM (Reverse Time Migration). It was first done on 2D data in the '80s, but didn't become reasonable on 3D surveys until about 2008-'09, and that improvement in imaging is one of the drivers of subsalt exploration. The next big step is FWI (Full Waveform Inversion); we still don't have enough power to run it mainstream yet. The main idea is that the stuff we mute out as noise is actually just data we can migrate back to its original location.

    The other item more power helps us with is running migrations at higher frequencies. Right now we record at 250 Hz (125 Hz Nyquist) but only process at 60 Hz, mainly due to the price of computer time: doubling to 120 Hz requires 4 times more computer time, but allows us to double our image resolution from 50 meters to 25 meters. Considering some of our target reservoirs are as narrow as 20 feet, this sort of thing is important.
      • The other item more power helps us with is running migrations at higher frequencies. Right now we record at 250 Hz (125 Hz Nyquist) but only process at 60 Hz, mainly due to the price of computer time: doubling to 120 Hz requires 4 times more computer time

        Worse than that, doubling the frequency requires 16 times the computer time, as you need to halve your spatial sampling interval to prevent aliasing. Assuming we're talking about wave equation based techniques such as RTM.

        This is one of those areas which is still very much "no matter how powerful the computer, we'll use it all".
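
        To make the scaling explicit (the exponent is the standard grid-refinement argument for wave-equation methods, not a figure from the thread):

          def migration_cost_factor(f_new, f_old, space_dims=3):
              # Doubling frequency halves the sampling interval in each spatial
              # dimension (2**3) and also halves the timestep (2): 2**4 = 16.
              return (f_new / f_old) ** (space_dims + 1)

          print(migration_cost_factor(120, 60))  # 16.0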

      • Interesting!

        So with exponentially growing computing power, how does that affect the rate we find new resources?

      • Considering some of our target reservoirs are as narrow as 20 feet,

        Wow, you've got some thick reservoirs! I was reminiscing a couple of days ago about a geosteering job I did trying to run an 8.5in drill bit along a 6in storm sand deposit. But that was a bit extreme.

  • We already know about far more oil reserves than we can wisely burn. Finding more is only going to make things worse.

    • I don't think it makes things worse. It just doesn't do anything - at least in the short term - to solve the problem.
      • by manicb ( 1633645 )

        Uh, yes, it absolutely makes things worse. See Limits To Growth [wikipedia.org] and read up on the concept of overshoot. By doing permanent (or quasi-permanent) environmental damage now, we jeopardise the possibility of ever reaching a stable condition with a decent quality of life, and extending this aggravates the problem hugely.

        • Uh, I should have been more specific... SEARCHING for oil and gas doesn't make things worse, and the article is talking primarily about the SEARCH. Of COURSE, actually extracting the oil/gas would make a significant environmental difference. But depending on where the oil is found, regulatory pressures may keep them from ever actually extracting it. There are already known oil and gas reserves that are a long way from being extracted due to regulatory restrictions.
    • by Anonymous Coward

      Petroleum is used for many other things than being burned. Even if we switched to 100% renewable energy tomorrow, there would still be a demand for this sort of work.

    • As the nuclear arms race back in the Cold War proved, making the Earth uninhabitable only once is hardly sufficient: You have to make the Earth uninhabitable 25 times over just to be sure.

  • Every once in a while, I hear of an owner taking their company's new toy and having a little fun with it. I wish I owned this company just so I could run WCG or Folding@Home on it for a week.

    Maybe I could sneak in and add a WCG daemon. I wouldn't be stealing resources, just.. liberating.. idle.. ones.
  • I noticed they will not be using it to find ways to conserve energy.

    • Those searching for oil are rarely in the same camp as those looking to conserve it.
    • You don't need to find ways to conserve energy. Energy conservation is built right into the rules of the universe.

      Of course conserving oil would be a good idea.

      • by khallow ( 566160 )

        Of course conserving oil would be a good idea.

        Using oil productively is a good idea too. Maybe even a much better one than "conserving" oil is. Depends on what you mean by "conserving oil".

  • reinvested to build a supercomputer placing 9th...
    http://cnsnews.com/news/article/frances-total-gets-oil-price-profit-boost [cnsnews.com]

    Very smart way to spend that money, even though I'm not a big fan of hydrocarbons. I'm really curious how hard it was to justify to the board members.

    • Very smart way to spend that money even though I'm not a big fan of hydrocarbons.

      Actually I'm a big fan of hydrocarbons. It's a shame that we burn them.

      • by csubi ( 950112 )

        Fully agree - sad how hydrocarbons became associated in my mind with zombies sitting in endless traffic jams in the WashDC area...

  • by mrbongo ( 926079 ) on Monday March 25, 2013 @02:49PM (#43275027)
    There are actually quite a few of these big machines, most of them in Houston but some in Europe. Every major oil company and every large seismic company has one. They are all huge; I have never seen one of them shut down to run benchmarks, and most folks don't externally advertise their existence. They cost too much and have too much backlog, and will never appear on a bullshit benchmark web page reserved for underutilized supercomputers.

    To the person asking if these are overkill: no. The folks referencing RTM, FWI, etc. have hit the equations on the head. One processing job may take 6+ months to run a single migration using 20,000+ CPUs. They run all kinds of CPUs and GPUs, and change out masses of them every time there is a step change in a chip's efficiency. If they had chips 100 times more powerful, they would have equations waiting for them.

    With regards to the person or people talking about carbon ending it all: these machines enable the reservoir engineers to target more reservoirs and then deplete those reservoirs more efficiently, leaving less hydrocarbon behind (theoretically reducing the number of dry wells). We will never run out of oil; we will, however, run out of the technology to extract it from the ground efficiently (or it will become cost-prohibitive). Carbon use, however, is another kettle of fish. Making hydrocarbons more expensive will only push coal back to the front (look at China, Germany, etc.). Until use is addressed, alternatives will never be what they could be. Doing things like shooting ourselves in the foot with ethanol is a good way not to proceed, though.
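
    To put that migration job in perspective, the core-hours implied by those two round numbers (and nothing else):

      cores = 20_000
      hours = 6 * 30 * 24    # ~6 months of wall-clock time
      print(cores * hours)   # 86,400,000 core-hours for a single migration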
  • I know that the fossil fuel industry is narrowly focused on finding more hydrocarbons to burn since they can count these as assets and run up their stock price, but there is growing consensus among climate scientists (not deniers) that we can only burn about one third of the hydrocarbons that we have already discovered if we are to avoid climate catastrophe.
    It would be nice if raw greed didn't run the world. However, reality will intrude sooner or later and all of these new discoveries will become worthless.

    • It would be nice if raw greed didn't run the world. However, reality will intrude sooner or later and all of these new discoveries will become worthless.

      The problem is: How many people die before reality intrudes?

    • by khallow ( 566160 )

      but there is growing consensus among climate scientists (not deniers)

      In other words, there is "growing consensus" among the group that already agrees that climate change is going to be really bad, that it is going to be really bad. In the real world, we would call this observation bias.

      However, reality will intrude sooner or later and all of these new discoveries will become worthless.

      What's the mechanism? All I've heard is somewhat higher sea level, a little acidification of the oceans, slight changes in rainfall patterns, and modest rise in global mean temperature, all over long periods of time as humans see it.

      Meanwhile real problems like population growth, political

  • What is the Government doing with such powerful supercomputers?....

    I'm sure it isn't Pong.

  • The cleaner and faster we can find such minerals, the better....

    Ferret
  • The post just lifts the first four paragraphs of TFA and randomly adds in misspellings of its subject.
    Personally I feel uncomfortable with the lack of even a slight attempt to summarize, unless you count deleting carriage returns.

  • ... especially if it's not there.

"Gotcha, you snot-necked weenies!" -- Post Bros. Comics

Working...