AC and DC Battle For Data Center Efficiency Crown
jfruh writes "AC beat DC in the War of the Currents that raged in the late 19th century, which is why most data centers today run on AC power. But as cloud computing demands and rising energy prices force providers to squeeze every ounce of efficiency out of their data centers, DC is getting another look."
Re:Makes sense. (Score:4, Insightful)
As opposed to the transformer coming into your building? How about the UPS and HVAC units supporting your server room?
Obviously, you'll have redundant DC power supplies, just like you do now. Except instead of having two AC->DC power supplies per PC, you'd route two room-level DC feeds to each machine in the room. Lots of little, less efficient, lower-quality power supplies get replaced by a pair of high-quality, high-efficiency supplies.
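A rough back-of-the-envelope sketch of the tradeoff above. The efficiency figures (85% for a commodity per-PC supply, 95% for a large room-level rectifier) are illustrative assumptions, not measurements:

```python
def wall_watts(load_watts: float, stage_efficiencies: list[float]) -> float:
    """AC wall power needed to push load_watts through the given conversion stages."""
    watts = load_watts
    for eff in stage_efficiencies:
        watts /= eff
    return watts

# Per-machine AC->DC supply inside each PC, assumed ~85% efficient.
per_pc = wall_watts(100.0, [0.85])

# One shared room-level AC->DC rectifier, assumed ~95% efficient.
room_level = wall_watts(100.0, [0.95])

print(f"per-PC supply: {per_pc:.1f} W from the wall per 100 W load")
print(f"room-level rectifier: {room_level:.1f} W from the wall per 100 W load")
```

Multiply that difference of a dozen-odd watts per box by thousands of servers (plus the HVAC load to remove the extra heat) and the centralized-supply argument starts to add up.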
Slashad (Score:4, Insightful)
Re:Makes sense. (Score:4, Insightful)
I'm more concerned that I convert AC to DC to charge a battery, then invert it back to AC to feed a power supply in my machine that outputs DC anyway. Why can't I just run my PC off a battery that's kept charged by DC from a single power supply? I don't need the efficiency of AC for long-distance transmission (we're talking maybe 3 feet), so why convert it back to AC at all?
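The chain the parent describes can be totted up the same way. All stage efficiencies here are assumed round numbers for illustration, not vendor specs:

```python
import math

# Double-conversion path: AC -> DC (UPS charger/rectifier), DC -> AC (inverter),
# then AC -> DC again inside the PC's own power supply.
double_conversion = [0.90, 0.90, 0.85]

# Hypothetical direct-DC path: rectify once at a shared supply; the battery
# floats on the DC bus, so no inverter or second rectifier stage is needed.
direct_dc = [0.95]

end_to_end_ac = math.prod(double_conversion)
end_to_end_dc = math.prod(direct_dc)

print(f"double conversion: {end_to_end_ac:.1%} of wall power reaches the PC")
print(f"direct DC:         {end_to_end_dc:.1%}")
```

Under those assumptions the double-conversion path wastes roughly a third of the wall power in heat before the load sees a single watt, which is exactly the overhead the direct-DC proposals aim to eliminate.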