Internet Uses 9.4% of Electricity In the US

ribuck writes "Equipment powering the internet accounts for 9.4% of electricity demand in the U.S., and 5.3% of global demand, according to research by David Sarokin at online pay-for-answers service Uclue. Worldwide, that's 868 billion kilowatt-hours per year. The total includes the energy used by desktop computers and monitors (which makes up two-thirds of the total), plus other energy sinks including modems, routers, data processing equipment and cooling equipment."
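As a rough sanity check on those figures, here is a back-of-envelope conversion of the worldwide total into average power; a minimal sketch, assuming only the 868 billion kWh/year figure and the two-thirds desktop share quoted in the summary above.

```python
# Back-of-envelope check on the figures quoted in the summary.
# Inputs: 868 billion kWh/year worldwide, two-thirds of it attributed
# to desktop PCs and monitors (both figures from the summary above).

HOURS_PER_YEAR = 365.25 * 24                      # ~8766 h

worldwide_kwh_per_year = 868e9                    # kWh/year
average_power_gw = worldwide_kwh_per_year / HOURS_PER_YEAR / 1e6   # kW -> GW

desktop_share = 2 / 3
desktop_gw = average_power_gw * desktop_share

print(f"Average worldwide draw: ~{average_power_gw:.0f} GW")       # ~99 GW
print(f"Desktops and monitors:  ~{desktop_gw:.0f} GW")             # ~66 GW
```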
  • about World of Warcraft, a fictitious "country", using 10x more electricity than a real country, Vanuatu?

    I actually just pulled that factoid out of my ass, but I'd bet good money, considering this research on the Internet and power usage, that it is true after all.

    Save Vanuatu! Unplug WoW!
  • by mind21_98 ( 18647 ) on Thursday September 27, 2007 @05:08PM (#20775197) Homepage Journal
    By how much would our energy use go down if we transitioned to servers and network equipment that use less energy? 9% seems like an awful lot to me, especially since the US relies on coal for its power production (something that generates lots of CO2)
    • by Original Replica ( 908688 ) on Thursday September 27, 2007 @05:51PM (#20775757) Journal
      By how much would our energy use go down if we transitioned to servers and network equipment that use less energy?

      The first place I would look to conserve energy is turning things off as opposed to standby. Televisions use 23% of their annual electricity while in standby; for VCRs that jumps to 50%. http://www.eere.energy.gov/buildings/info/documents/pdfs/lbnl-42393.pdf [energy.gov] So if we turned monitors and computers and wireless routers and printers etc. completely off when we were not using them, the savings would likely be significant. As an added bonus, your computer can't be a zombie spam bot when the power is turned off.
      • by mh1997 ( 1065630 )

        The first place I would look to conserve energy is turning things off as opposed to standby. Televisions use 23% of their annual electricity while in standby; for VCRs that jumps to 50%. http://www.eere.energy.gov/buildings/info/documents/pdfs/lbnl-42393.pdf [energy.gov] So if we turned monitors and computers and wireless routers and printers etc. completely off when we were not using them, the savings would likely be significant. As an added bonus, your computer can't be a zombie spam bot when the power is turned off.

        • by EaglemanBSA ( 950534 ) on Thursday September 27, 2007 @08:42PM (#20777235)
          I can count 7 LEDs on my equipment where I'm sitting that I could pretty much do without... that doesn't really amount to much. LEDs don't consume very much power at all (especially the low-power type found in electronics equipment) - you're talking on the order of a couple of volts at tens of milliamps. Example: Digikey # 67-1047-ND has a peak voltage of 2 volts and a max current of 30 mA. At peak conditions, this puts out 0.06 watts. Convert that to energy spent in a year, and it amounts to (assuming 8766 hours per year) about half a kilowatt-hour, or 5 cents, if you left that one LED on constantly, at peak, all year. That's 35 cents for all the LEDs in my room, a total of three and a half kilowatt-hours (a typical home burns about a thousand per month, or 12,000 per year). I'll just switch my computer off.
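A quick sketch of the arithmetic in the comment above; the 2 V / 30 mA peak figures come from that comment, while the $0.10/kWh electricity price is an assumed round number.

```python
# Annual energy and cost of one always-on indicator LED, using the
# peak figures quoted above (2 V, 30 mA) and an assumed $0.10/kWh.

volts, amps = 2.0, 0.030
hours_per_year = 8766
price_per_kwh = 0.10                   # assumed electricity price

watts = volts * amps                             # 0.06 W
kwh_per_year = watts * hours_per_year / 1000     # ~0.53 kWh
cost_per_year = kwh_per_year * price_per_kwh     # ~$0.05

print(f"{watts:.2f} W -> {kwh_per_year:.2f} kWh/yr -> ${cost_per_year:.2f}/yr")
```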
          • by jabuzz ( 182671 ) on Friday September 28, 2007 @03:12AM (#20779541) Homepage
            Replace that with a low-current LED, for example RS 180-8495: 2mA at 1.8V, or 0.0036W. There is nothing inherently wrong with standby. The problem is that current designs are as cheap as possible, not as low-power as possible. You could easily design a circuit that brings, say, a TV out of standby with a remote while consuming less than 100mW. It would cost more, but it is perfectly doable.

            What is required is legislation mandating that, say, standby can consume no more than 1W, then crank that down over the years. Another would be legislation for minimum levels of efficiency in power supplies; 85% would be a good starting point, and then crank it up over the years.
            • by Eivind ( 15695 ) <eivindorama@gmail.com> on Friday September 28, 2007 @05:40AM (#20780161) Homepage
              Or a milder variant of the same: give appliances a clear grade and mandate displaying the grade prominently.

              That's already the case in most of Europe if you buy a dishwasher, fridge, washer, drier or lots of other household appliances.

              There's a grade for energy-efficiency, where the average for that kind of appliance is a "C" whereas an appliance that uses 30% less than average would earn an A, and an appliance that wastes 30% more energy than average earns an "F".

              The stuff has been a huge success -- to the point where appliances that don't rate at least a "B" are just not marketable at all.

              The standard gets stricter automatically: As more and more people buy the energy-efficient appliances, the *average* efficiency improves, so the energy-usage for a "C" gets adjusted accordingly.

              Works like a charm.

              Some appliances have more than one grade; they grade efficiency on more than one scale. A dishwasher may have a note on it saying:

              Energy consumption: A
              Water consumption: B
              Wash effectiveness: A
              Drying effectiveness: A

              So, I don't see why a modern TV couldn't be sold with: "Energy consumption: A", "Standby consumption: B".

      • A VCR or DVR that is truly powered off is not going to be much use, however, since it won't turn on to record the shows you want to watch.
    • CO2? Don't forget radioactive particles. Coal ash contains trace amounts of the radioactive metals uranium and thorium, along with toxic elements such as cadmium.
  • Close to accurate? (Score:3, Informative)

    by Kazrath ( 822492 ) on Thursday September 27, 2007 @05:09PM (#20775211)
    The information he seems to be pulling from was from the early 2000s. Many things have changed since early 2000, lowering the amount of power needed for the average home PC to operate. Most users in early 2000 were using CRT monitors, which use almost 3 times as much power as a modern LCD. If I took the time to compare 2000-2002 components against those from the last two years, I bet you would see that the power consumption of average hardware is close to half as much.

    • by trolltalk.com ( 1108067 ) on Thursday September 27, 2007 @05:11PM (#20775251) Homepage Journal

      Many things have changed since early 2000, lowering the amount of power needed for the average home PC to operate. Most users in early 2000 were using CRT monitors, which use almost 3 times as much power as a modern LCD. If I took the time to compare 2000-2002 components against those from the last two years, I bet you would see that the power consumption of average hardware is close to half as much.

      And the average cpu uses a LOT more juice. So does the average video card. Who's buying all those 550 watt PSUs?

      And the average home has more computers in it than it did 5 years ago. Who do you know who has only one computer nowadays?

      • Re: (Score:3, Insightful)

        by jedidiah ( 1196 )
        A 550 watt PSU is like a red sportscar or a phat offroad vehicle.

        The person who buys it may not fully utilize it.

        It just seems "the thing to get".

        Something else to consider is the rise of laptops.
        • by nuzak ( 959558 )
          A 550 watt PSU won't even adequately power a SLI/Crossfire setup. They're selling kilowatt PSU's these days. Hell, I see one outfit selling a 1.6KW PSU. Now that might be overkill.
          • Re: (Score:3, Insightful)

            by tlhIngan ( 30335 )

            A 550 watt PSU won't even adequately power a SLI/Crossfire setup. They're selling kilowatt PSU's these days. Hell, I see one outfit selling a 1.6KW PSU. Now that might be overkill.

            How is that possible? It would mean either the power supply can only supply around 1.3kW, or you're gonna have to hire an electrician to wire in a new 20 amp circuit just for that PC.

            A regular 15 amp service at 110V only gives you 1650 watts of power. A PSU rated at 1600W, at "80+" certification (which so far appears to mean they'

            • by lgw ( 121541 )

              A regular 15 amp service at 110V only gives you 1650 watts of power
              Not quite: 110V is already the RMS figure, but the electrical codes I've seen only allow a continuous load of about 12A on a 15A circuit, so you can draw maybe 1300W from a wall socket on a sustained basis. Factor in conversion losses, and a 1kW power supply is already about the limit of what an "80+" PSU can deliver from a wall socket.
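For what it's worth, a rough sketch of that budget; the 80% continuous-load limit and the 80% PSU efficiency are assumptions for illustration, not measured values.

```python
# Rough DC power budget for a PC on a single 15 A / 110 V branch circuit.
# Assumptions: continuous loads limited to ~80% of the breaker rating
# (i.e. ~12 A), and an "80 Plus"-style PSU at ~80% efficiency.

breaker_amps = 15
line_volts = 110                       # RMS
continuous_fraction = 0.80             # assumption
psu_efficiency = 0.80                  # assumption

max_input_watts = breaker_amps * continuous_fraction * line_volts   # ~1320 W
max_dc_output = max_input_watts * psu_efficiency                    # ~1056 W

print(f"Sustained input: ~{max_input_watts:.0f} W, usable DC: ~{max_dc_output:.0f} W")
```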
          • by adolf ( 21054 )
            A 550 watt PSU won't even adequately power a SLI/Crossfire setup.

            Ok, point taken.

            So, the underutilized red sports car analog now officially belongs to the 1+ KW SLI camp.

            Non-SLI 550-Watt single-GPU rigs have accordingly been demoted to being the analog of the underutilized blue sports car.

            Thanks for the tip!

          • Most of the >1kW PSUs I've seen haven't made it into the consumer market yet, but those already in there claiming to be >1kW often aren't able to pull that much due to cheap/bad electronic setups (both inside and outside of the unit).
        • Re: (Score:3, Informative)

          by Technician ( 215283 )
          A popular misconception is that a 65 watt laptop power supply draws 65 watts, a 350 watt desktop power supply draws 350 watts, and a 550 watt power supply draws 550 watts. These numbers are mostly WRONG. The wattage a power supply draws is equal to the amount of power drawn from the supply plus the loss in conversion (efficiency) of the supply. The wattage stamped on the box is simply the capacity of the supply. A 550 watt supply is supposed to be able to provide 550 watts out. If the supply is 90% efficient,
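A tiny sketch of that relationship between rated capacity, delivered load, and wall draw; the load and efficiency numbers below are made up for illustration.

```python
# Wall draw of a power supply is (delivered load) / (efficiency),
# not the wattage printed on the box. Example numbers are made up.

def wall_draw(load_watts: float, efficiency: float) -> float:
    """Power pulled from the outlet to deliver a given DC load."""
    return load_watts / efficiency

print(wall_draw(300, 0.80))   # 375.0 W drawn for a 300 W load on a "550 W" PSU
print(wall_draw(550, 0.90))   # ~611 W, and only if the supply is actually maxed out
```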
        • Re: (Score:3, Informative)

          That's a poor analogy.

          Power supplies don't just magically use power when they are on. It takes a load (video card, motherboard, cpu, etc) to be drawing that power.

          Now there is a question of efficiency, but that has nothing to do with its power rating. A 400W power supply with a 60% efficiency rating is going to piss away more electricity in the form of heat than a 550W that is rated at 85% efficiency.

          The el cheapo power supply with the lower rating may not cost much now, but you'll pay it all back

      • Why do you assume that this 550 Watt power supply is constantly drawing 100% power? Usually your computer will use a small fraction of the maximum power it is rated at when your CPU, disks, and video cards are idle (which is pretty much any time you are not playing an FPS or editing a video).

        Also consider that today's power supplies are often >80% efficient, which has probably doubled in the last five years. In addition, Windows now implements CPU idle functionality which it did not do in Win 98 IIRC, re
        • > "Why do you assume that this 550 Watt power supply is constantly drawing 100% power? "

          I could say "why do you assume that 300 watt power supplies 7 years ago were constantly drawing 100% power?"

          If they're both drawing on average 50%, then power use still has almost doubled.

          > Also consider that today's power supplies are often >80% efficient

          Only the most expensive - most sub-$100 PS are only 25 to 40% efficient.

      • Re: (Score:2, Informative)

        by vio ( 95817 )

        And the average cpu uses a LOT more juice. So does the average video card. Who's buying all those 550 watt PSUs?
        And the average home has more computers in it than it did 5 years ago. Who do you know who has only one computer nowadays?

        Actually, the average CPU nowadays is pretty good at dropping down in power usage when idling (something mostly unheard of in the "MHz race" that characterized the early new millennium). And most people have integrated "video cards" now (i.e. built into the motherboard) which use way less power... the 550W PSUs are for the crazies (an extreme minority).

        And let's not forget that the ratio of (power efficient) laptops to computers has increased dramatically over the years...

        But yea, there are more computers than e

      • by tygt ( 792974 )
        Definitely; we have a lot more computers now. I've been a computer professional (software engineer) since the late 80's, and I've typically only had 1-2 computers... till the last 6 years. And, I'm not talking relics that I adore and occasionally boot, I'm talking computers which are on most if not all of the time.

        At full tilt now, between my work computers and those my family use, we've got 1 router (cisco), 4 WAPs (for us and the neighborhood), 4 "desktop/server" types running (2 headless), and my wife'

    • Yes, but I would wager that a good share of those people that were running CRTs 5-7 years ago are still running CRTs at home. Most home users cannot justify the cost of a new monitor unless their old one dies. And CRTs are pretty reliable for the most part. And as others are pointing out, the CPUs are pulling 10x the wattage they were a few years ago along with video cards that require exotic cooling solutions and extra power wires. We are a gluttonous society.
    • by turing_m ( 1030530 ) on Thursday September 27, 2007 @07:08PM (#20776479)
      "lowering the amount of power needed for the average home PC to operate."

      And this will continue to change. People are becoming aware of resource scarcity, and want to insure themselves against rising prices. Witness the rise of cheap power meters such as the Kill-A-Watt. These took years to move over to 240V simply because they couldn't keep up with the demand for 110V items.

      Something like a WRAP uses 5 Watts. Use it as a firewall/router/ADSL modem/traffic shaper, and it's going to be a cheaper and smaller solution than the typical 20+ Watt modem/router box.

      Even CRTs have dropped in power usage compared to what they used to.

      We are rapidly approaching the day when our computers will be fast enough for most tasks, the hard drive will be solid state, the system will be passively cooled and made from reliable parts that will last for decades, drawing minimal power. Any media that won't fit on the solid state hard drive can be stored on the spinning kind and plugged in as needed via USB/eSATA/firewire.

      Intel probably doesn't want us to have these systems. AMD may or may not. Via certainly does, and you can bet that for pretty much everyone in the first world there is a market for several of these type of systems at a $300 price point or so. That may be a reduction in profitability for Intel, but it will be a massive new market for others, and getting easier to enter all the time.
  • by truckaxle ( 883149 ) on Thursday September 27, 2007 @05:09PM (#20775219) Homepage
    This figures... doesn't the brain use about 30% of the blood's oxygen?
  • meh (Score:5, Insightful)

    by Eternauta3k ( 680157 ) on Thursday September 27, 2007 @05:09PM (#20775223) Homepage Journal
    They shouldn't count PCs; they have many more uses than just the internet.
    Also, pirates counter global warming...
    • Re:meh^2 (Score:3, Interesting)

      by redelm ( 54142 )
      Agreed on the PCs, especially those in business settings. Many of those are forbidden or otherwise blocked from the Internet, and would exist and be run without the Internet. Their predecessors were.

      Furthermore, a large fraction of the remaining 1/3rd of power is servers. Many of them would be run even without the internet, most probably as internal servers for 1-800 phone reps.

      The actual power attributable to the Internet is probably quite small. And certainly less than the gasoline and other motor

    • They shouldn't count PCs, they have many more uses than just the internet.

      At night when they are left on and nobody is using them, they just act as space heaters. Now consider a building that is running both A/C and is full of idle computers, and you have a lot of wasted energy. I know many people say that standby uses more energy than necessary, but it uses a heck of a lot less than a machine left on for no reason.
    • Re: (Score:3, Insightful)

      by nine-times ( 778537 )

      Yeah, so they're basically including all computer equipment, not just "the Internet". They're even including servers in datacenters and air conditioning in datacenters.

      So computer equipment uses a decent percentage of all electricity in a civilization where a lot of industry is based on knowledge, entertainment, and other intellectual property, most of which has gone digital. Thanks, captain obvious. Next thing you know, you'll tell me that a large percentage of oil and coal are used in transportation a

    • TFA states that 25% of the power consumed by computers goes toward powering local networking hardware, which is factored at about 20% of the total consumption of the Internet.

      This means that a typical small office with 20 computers has local networking hardware consuming the equivalent of 5 PCs.

      Sources cited in TFA state that each PC uses an average of (588 kWh / 365.25 / 24 × 1000) ≈ 67 watts, which seems reasonable enough. But that (67 × 5) means that 335 watts worth of network infrastructure gear are present in a
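Spelling out the arithmetic cited above as a sketch; the 588 kWh/year per PC and the 25% networking share are the figures the parent comment attributes to TFA, and the 20-PC office is the hypothetical from that comment.

```python
# Average per-PC draw and the small-office networking equivalence above.

kwh_per_pc_per_year = 588              # figure attributed to TFA
hours_per_year = 365.25 * 24

avg_watts_per_pc = kwh_per_pc_per_year / hours_per_year * 1000      # ~67 W

pcs_in_office = 20
network_share = 0.25                   # networking gear as a share of PC power
network_watts = pcs_in_office * avg_watts_per_pc * network_share    # ~335 W

print(f"~{avg_watts_per_pc:.0f} W per PC, ~{network_watts:.0f} W of network gear")
```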
  • 99.9% (Score:5, Funny)

    by SevenHands ( 984677 ) on Thursday September 27, 2007 @05:09PM (#20775231)
    and 99.9 percent of this 9.4% is a result of pr0n!
    • Re: (Score:2, Funny)

      by Tablizer ( 95088 )
      and 99.9 percent of this 9.4% is a result of pr0n!

      But it makes up for it all by reducing the birth rate.
    • Pr0n is the reason for so many computers being switched on, but pr0n will also power North America's electrical grids.

      My name is John Titor, and I'm from the future. In the next five years, a man already known in your time for an innovative invention will stun investors and send panic through energy markets with his Wankamo, a masturbation-powered battery charger that attaches to the forearms of the growing number of desperate North American nerds.

      Using the Wankamo, desperate nerds will attempt to attract w
  • by Tablizer ( 95088 ) on Thursday September 27, 2007 @05:09PM (#20775235) Journal
    Remember the article saying that more people are browsing the web *instead* of watching TV? That would mean that TV power is going to PCs instead. (Except maybe for those who leave both on, and some PCs + monitors take more power than a TV.)
    • Re: (Score:3, Insightful)

      by jedidiah ( 1196 )
      What about when I read CNN or Slashdot.org through MythBrowser on my 55 inch projection TV?
    • Computer monitors generally use more power than (standard def) TVs. A 19" CRT monitor will probably use 60 watts, while a 19" CRT TV will probably use 40 watts.

      The flat-screen monitor trend will no doubt reverse this, eventually, but the disparity is big, and there's a tremendous installed base of CRT monitors out there. Not to mention that flat-screen TVs are slowly catching on.
    • I'm watching as much or more TV than 10 years ago, and spending more time on the Internet. The only thing I'm doing less is sleeping.
    • That would mean that TV power is going to PC's instead.
      That may be true for people who live by themselves. But consider a family of 4 with one computer and one television. They take turns using the computer, while the remaining three watch TV. So each individual's TV time decreases by 1/4, but the total time that the television is on remains the same. Thus the power used by the computer is in addition to that of the television.
  • by Kingrames ( 858416 ) on Thursday September 27, 2007 @05:11PM (#20775255)
    Tubes require no electricity!
  • really? (Score:2, Insightful)

    by xordos ( 631109 )
    From the article: PCs & monitors alone use 235 billion kWh out of the 350 billion total, which means PCs & monitors alone account for ~6% of US power; something seems wrong here.
  • So are we going to see desktops switching to a slower VIA-type processor and video card when not running CPU/GPU intensive applications? How about monitors switching from backlit to reflective mode when the built-in camera detects abundant light?
    • by glwtta ( 532858 )
      So are we going to see desktops switching to a slower VIA-type processor and video card when not running CPU/GPU intensive applications?

      Seems like having the same processor draw less power when not under load would be more straightforward. It almost seems like there's something out there that does this already... Some kind of "portable computer" or something crazy like that.

      (but yeah, it'd be nice if that technology found its way into mostly over-powered general purpose desktops)
      • by iamacat ( 583406 )
        Not the same thing. Smaller circuits inside more complex processors leak more power through inadequate insulation. Even if the schematic of the simpler processor were replicated inside the more complex one, it would draw more power than a dedicated chip.
  • Ridiculous Units (Score:5, Informative)

    by John Hasler ( 414242 ) on Thursday September 27, 2007 @05:15PM (#20775287) Homepage
    > that's 868 billion kilowatt-hours per year

    That's simply 99 gigawatts. "kilowatt-hours per year" is silly.
    • by dgatwood ( 11270 ) on Thursday September 27, 2007 @05:26PM (#20775455) Homepage Journal

      While you're right that 868 billion kWh/yr is about 99 gigawatt-hours per hour, or 99 gigawatts continuous, I think it is moderately more understandable to use the more traditional time-based watt-hour units rather than the continuous watt units, as that's what people are used to seeing on their electric bill. I'd have probably described it as 868 terawatt-hours annually, though, and put 868 billion kWh in parentheses.

      • Re: (Score:3, Interesting)

        by DJ Rubbie ( 621940 )
        Or according to Einstein (and Google):

        868 billion kilowatt hours = 3.1248 × 10^18 joules
        (3.1248 × 10^18 joules) / (c ** 2) = 34.768089 kilograms

        So keeping the current Internet running requires turning nearly 35 kilograms of mass into electricity.
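The same conversion as a quick sketch; the 868 billion kWh figure comes from the summary, and c is the speed of light.

```python
# E = mc^2 run backwards: the mass-equivalent of 868 billion kWh.

KWH_TO_JOULES = 3.6e6
C = 299_792_458                        # speed of light, m/s

energy_joules = 868e9 * KWH_TO_JOULES  # ~3.12e18 J
mass_kg = energy_joules / C**2         # ~34.8 kg

print(f"{energy_joules:.4e} J  ->  {mass_kg:.1f} kg (per year)")
```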
        • So keeping the current Internet running requires turning nearly 35 kilograms of mass into electricity.
          Mostly hot pockets and jolt, no doubt. You might want to cite a time frame, by the way; I doubt you mean 35 kg for the remainder of eternity.
    • Yes, it is, but unfortunately electricity is sold by the kilowatt-hour in the USA. I'd prefer the joule, myself, and people should measure power in watts, but alas we're stuck with the "my car gets fourteen rods to the hogshead and that's the way I like it" mentality. Even if you're hung up on the old miles/lbs/ergs/horsepower system, units like kilowatt-hours per year should seem pretty stupid.
      • Kilowatt-hours are convenient units for the most common calculation you want to perform, which is 'how much does it cost to run this appliance, rated at n watts, for one hour?' If you started with dollars per joule, you would have an extra factor of 3600 to worry about, which makes it much harder to do this kind of calculation in your head. If it costs you 10¢/kWh, you know it costs 1¢ to run a 100W light bulb for an hour, 30¢ to run a 3kW fan heater, etc. If you know it costs 0.00278¢/kJ the calculation is a lot har
    • > that's 868 billion kilowatt-hours per year

      That's simply 99 gigawatts. "kilowatt-hours per year" is silly.
      I guess you also don't say your car goes 70 miles per gallon, but your car goes 105 per acre, right?
      (I hope I got that calculation right; those US units are a true nightmare ...)
    • "kilowatt-hours per year" is silly.

      Not sillier than "miles per gallon". Why do I say my car gets 15 kilometers/liter, instead of saying it gets 1500 (centimeter)**(-2)? Because what I need is a practical way to determine how much fuel I need for a given trip, not a theoretical number.

      When you mention "kilowatt-hours per year" you get a kind of information that's different from simple kilowatts. Power consumption is not uniform; to supply 8760 kilowatt-hours per year you need more generation capacity on

  • However, we would still have 20 office computers to run our accounting system with or without the internet. Maybe the 4 remotes would be 1 with old-school leased lines. They could be older/smaller if I didn't need so much power to run the firewall/AV underneath, though.

    While new machines can suck the juice, the previous one had a 2000 watt disk drive.

    Here at least the internet only added a couple wall-warts and an extra GHz on the CPUs.

    At home, there are 2 decent computers pretty much online only. That is about half
  • Blame Game (Score:5, Funny)

    by Anonymous Coward on Thursday September 27, 2007 @05:17PM (#20775325)
    It's Al Gore's fault.
  • by lobiusmoop ( 305328 ) on Thursday September 27, 2007 @05:17PM (#20775327) Homepage
    This is why I think the OLPC project shouldn't be limited to third world countries. These laptops run on only a couple of watts! If more first-world computer users used them for basic surfing instead of 200 watt gaming rigs, much energy/CO2/fossil fuel could be saved, I think.
    • Then... what am I going to do with this quad processor machine with dual Geforce 8800 GTX SLI cards and a few TB of storage?

      I know, i'll trade it in for 50 of these laptops and build a beowulf cluster, super!

      Ahahaha. Oh you were serious?
    • by KKlaus ( 1012919 ) on Thursday September 27, 2007 @07:01PM (#20776413)
      Would it? Reminds me of the finding that hybrid cars didn't cut down on fossil fuel consumption as much as many people thought because, since they were more efficient, people drove them more. I don't doubt that the computers themselves would use less energy, but I suspect people might then use some of the money they save from their laptop to keep the house cooler/warmer or whatever.

      Not that that would be a bad thing of course, but since people already tend to moderate their electricity usage to what they can (or want to) afford, lowering use in one area must simply see it transferred to another - rather than reducing overall consumption.

      Cheers.
  • So everything in my building is support for the Internet? Seems like I might use the AC for something other than cooling a PC that has net access.
    • by ceoyoyo ( 59147 )
      No way. I know I only have a furnace because the specs on the computer say not to operate it below -20.
  • The total includes the energy used by desktop computers and monitors (which makes up two-thirds of the total)

    So "The Internet" makes up 3.13%, not 9.4%

    The other 6.27% is from desktop computers. Which may or may not be doing "internet stuff" at any moment in time. Lumping all desktop machines into the count is disingenuous.

    It's still a bigger number than I would have thought. And it is a bit of an eye opener to realize how much power all those PCs are using up.

  • Don't forget (Score:3, Insightful)

    by Paul Carver ( 4555 ) on Thursday September 27, 2007 @05:25PM (#20775441)
    Don't forget the vacuum cleaners used to clean the carpets in the buildings where the network designers and operators work, or the stereos that play music while people are browsing the net, or the electric lights that let the non-touch-typists see their keyboards at night.

    Come on, unless they're somehow able to measure electricity used only while a computer user is actively viewing Internet content it's absurd to count desktop computers in the total. Or, alternatively, it's absurd to attribute the electricity usage to "the Internet". It would be valid to estimate the electricity usage of computers and/or data communications equipment, but to try to pin a number on "the Internet" and include multifunction equipment that serves non-Internet functions is just sloppy.

    Come to think of it, there are probably lots of FT-2000s that carry some Internet circuits and some PSTN circuits, how do they account for that? What about the 5Es and DMSs that are carrying modem calls? Do they accurately attribute the percentage of the switch's electrical usage based on the percentage of modem vs voice calls?
  • by redefinescience ( 983582 ) on Thursday September 27, 2007 @05:47PM (#20775703) Homepage
    I wonder how much energy is actually SAVED because of the internet. Quick example: email. How much energy is used shipping a letter across the country?
    • My sentiments exactly. What are the savings in reduction of transportation costs related to distribution and acquisition of information? It's probably unquantifiable; the usage amount seems meaningless without this other half of the equation.
    • How much energy is spend delivering fake moon rocks, Star Trek sets, and other must-have items purchased from eBay?
    • Re: (Score:3, Funny)

      Internet use has cut my heating bill in the winter by 75%. My two computers keep my house toasty and warm.
  • The flip side (Score:5, Interesting)

    by femto ( 459605 ) on Thursday September 27, 2007 @05:48PM (#20775715) Homepage

    It would also be interesting to know how much energy the Internet saves. For example, instead of people flying around they talk on VoIP or have a teleconference. Documents are emailed rather than having to be flown around the world. Music and movies are downloaded rather than people driving to the shops for a disk. Or is the Internet promoting long-distance relationships that otherwise just would not be?

    The numbers do suggest that electronic equipment needs to be more efficient.

    • Newspapers &c. (Score:3, Insightful)

      by wytcld ( 179112 )
      Consider the rapid decline of newspapers - the hard copy as compared to online editions. This results in less energy-intensive and habitat-destructive logging on the one end, less fuel-burning distribution in the middle, and less waste paper to discard or recycle on the other end.

      Or consider the decline of the secretarial profession. Thirty years ago every junior executive on up had his or her own secretary. Now all they get is a laptop. It takes much more energy to feed a secretary than a laptop (although
    • Re: (Score:3, Insightful)

      by suv4x4 ( 956391 )
      The numbers do suggest that electronic equipment needs to be more efficient.

      You don't know that before seeing the full pie chart. How do other common tasks and equipment fare on this scale?

      Internet and desktop PCs perform thousands of roles crucial to our daily lives. Given how many millions of computers and Internet end-points operate, and how many uses those have, 9% is certainly not that much. We'd definitely have worse carbon emissions if it wasn't for the remote data transmission the Internet allows
  • "Excellent, excellent..." C. Montgomery Burns
  • Coralized link [nyud.net] for those who wish to read TFA.

  • "Equipment powering the internet accounts for 9.4% of electricity demand in the U.S."

    That doesn't seem to pass the initial sniff test. I know that on the consumer end, it's nowhere near that amount. And on the business end, at least from what I'm familiar with, the percentage is still lower than that. Sure, various ISPs, Google, and other places may drive it up, but still...

    It looks like they are assuming that if a PC is connected to the internet, that all electricity consumed by that pc, monitor,
  • I'm sure it's more (Score:3, Informative)

    by njfuzzy ( 734116 ) <`moc.x-nai' `ta' `nai'> on Thursday September 27, 2007 @06:01PM (#20775871) Homepage
    If you are including every device connected to the Internet, then surely it is more than that. The vending machine in my building is on the Internet. My phone is on the Internet. My laser printer is on the Internet, and in a way, I believe my cable box is too. Between infrastructure, servers, telecommunications, and end systems, a huge fraction of the electricity-using devices we interact with are on the Net.
  • by viking80 ( 697716 ) on Thursday September 27, 2007 @06:01PM (#20775873) Journal
    9.4% is probably way off, but here are some conversions/comparisons anyway:
    868 billion kilowatt-hours per year ≈ 10^11 W = 100 GW
    Space shuttle liftoff: 100 GW
  • I suddenly feel really bad for having a 300 watt CRT. But it does power down most of the time, so that makes up for it, right?
  • So, when I'm not online and instead working on a document, they consider this 'usage'?

    What if I'm multitasking? 100% of my power usage isn't going to view that webpage or email; it's a small percentage.
  • by linuxwrangler ( 582055 ) on Thursday September 27, 2007 @06:16PM (#20776015)
    Trouble is that leaving computers running is arguably a rational business tradeoff. If a desktop computer draws 250 watts (and most don't average that high) and is left on during all non-business hours (assume one works only an 8-hour day and no weekends), that is 128 hours per week, or 32 kWh, or, at $0.10/kWh, $3.20.

    If your entire employee cost (pay, bonus, worker's comp, medical, office space, etc.) is only $60,000/year, an employee needs to save less than 10 minutes per week to break even.

    One coder measured his own pretty high-end machine (including support for 3 monitors) at less than 140 watts when not doing heavy processing. This doesn't include the monitor which in most systems sleeps after a short period anyway. If we use 150 watts, a 9 hour day, and $100,000 employee cost then break-even happens by the time you have saved 2 minutes 15 seconds per week or less than 30 seconds per day.

    Now if it takes 2 watts cooling per watt of usage then the benefits of shutting down are greater. But on the other hand, none of the office buildings where I've worked have metered power or cooling (except for custom auxiliary units) so from the tenant perspective, leaving the machine running has no impact on power or cooling costs.

    Sure, for many, waiting for a computer to boot is part of the morning routine and provides an excuse to go fill the coffee cup. But if buildings metered power and cooling usage, and if computers were made to save state and switch off and back on like a light - or at least in just 1-2 seconds - people would be much more willing to power down not only at night but at lunch and whenever they aren't using the machine.
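A sketch of that break-even arithmetic; the wattages, electricity price, and employee costs follow the comment above, while 2080 paid work-hours per year is an additional assumption.

```python
# Break-even: minutes of employee time saved per week that would justify
# leaving a PC on outside business hours. Figures follow the comment above;
# 2080 paid work-hours/year is an extra assumption.

def breakeven_minutes_per_week(pc_watts, off_hours_per_week,
                               price_per_kwh, employee_cost_per_year,
                               work_hours_per_year=2080):
    electricity_cost = pc_watts / 1000 * off_hours_per_week * price_per_kwh
    cost_per_minute = employee_cost_per_year / (work_hours_per_year * 60)
    return electricity_cost / cost_per_minute

# 250 W PC, 128 off-hours/week, $0.10/kWh, $60k/yr employee -> ~6.7 minutes
print(breakeven_minutes_per_week(250, 128, 0.10, 60_000))
# 150 W PC, 123 off-hours/week, $0.10/kWh, $100k/yr employee -> ~2.3 minutes
print(breakeven_minutes_per_week(150, 123, 0.10, 100_000))
```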
  • "The total includes the energy used by desktop computers and monitors" you say. long before internet we were using computers.
  • by WillAffleckUW ( 858324 ) on Thursday September 27, 2007 @06:50PM (#20776339) Homepage Journal
    Instead of spending so much to cool them down, we could set up efficient cooling arrays, or even use the heat to store energy in biomass or fuel cells.

    The problem is that we are unwilling to revisit the basic design concepts.

    Why should a "desktop" computer crank out so much heat? My son's Mac Mini doesn't. His next computer won't either.

    There are better ways to do this.

    Besides, most of our energy use is for: lights (could use LED lighting for 1/20 the energy), washers (heating up all that water), and dryers (if we only got rid of those covenants that didn't let people line dry clothes), and machines that aren't even being used - look at that printer in the office, it's on 24/7 but after office hours, who is printing to it?

    For that matter, why are our gigapop Internet networks running 24/7 in most places? Couldn't we have master switches and routers with key servers that were on 24/7, and have the "desktops" turn OFF their monitors and even computers when no one was using them? Turn off LAN segments that aren't in use automagically.
