
University Switches To DC Workstations

An anonymous reader writes "Researchers at the University of Bath, UK, are undertaking an in-depth study of energy consumption within the new network, with the aim of demonstrating that running a large network of devices on DC rather than AC is both more secure and more energy efficient. AC power from the grid is converted to DC and runs 50 specially adapted computers in the University Library. Students using the system have noticed that the new computers are more compact and much quieter than the previous ones. The immediate advantages of the new system are not only for the user but also for the energy bill payer and the environment."
  • AC? (Score:5, Interesting)

    by Anonymous Coward on Tuesday March 22, 2011 @12:34PM (#35574628)

    The only thing inside a computer that actually runs on AC is the power supply, which converts it to DC voltages. It's also quite bulky and noisy compared to the other components.

    "Initial tests show that the system in Bath emits approximately half as much energy as heat than the previous AC powered system while running much faster."

    Yes, I'm sure it'll generate less heat when most of that heat comes from converting AC to DC, but why the hell would it run faster when everything else in the computer is still the same?
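
    For what it's worth, the heat claim is at least plausible on paper: a cheap desktop PSU can waste a quarter of its input at light load, while one big shared rectifier can do much better. A back-of-the-envelope sketch (the load and efficiency figures below are illustrative assumptions, not numbers from TFA):

    ```python
    # Back-of-the-envelope: 50 per-machine PSUs vs. one central rectifier.
    # LOAD_W and both efficiency figures are assumptions, not from TFA.

    N_MACHINES = 50            # machines in the library deployment
    LOAD_W = 60.0              # assumed DC load per machine, watts

    PSU_EFFICIENCY = 0.75      # assumed cheap desktop PSU at light load
    CENTRAL_EFFICIENCY = 0.92  # assumed large central AC-to-DC rectifier

    def wall_power(load_w, efficiency):
        """AC power drawn from the grid to deliver load_w of DC."""
        return load_w / efficiency

    local = N_MACHINES * wall_power(LOAD_W, PSU_EFFICIENCY)        # ~4000 W
    central = N_MACHINES * wall_power(LOAD_W, CENTRAL_EFFICIENCY)  # ~3261 W

    print(f"50 local PSUs:     {local:.0f} W from the grid")
    print(f"Central rectifier: {central:.0f} W from the grid")
    print(f"Heat avoided:      {local - central:.0f} W")
    ```

    Under these assumptions that's roughly a 15-20% cut in wall power; nothing in it explains "running faster", though.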

  • by rwade ( 131726 ) on Tuesday March 22, 2011 @12:34PM (#35574646)

    Selective quotes from TFA [bath.ac.uk]:

    Researchers at the University are undertaking an in-depth study of energy consumption within the new network, with the aim of demonstrating that running a large network of devices on DC rather than AC is both more secure and more energy efficient.

    The new DC network also offers greater security. DC power supply units have a simpler design, with fewer parts that could fail and need replacing. The system at the University also charges a number of batteries when usage levels are low to allow the system to run independently from the grid for up to eight hours should a cut in power be experienced.

    The above two paragraphs are the only ones I could find in TFA that mention security. I gotta ask -- can anyone speculate how centralizing the PSU would lead to a more secure system? Is it possible there's a regional definition of "secure" meaning "very reliable" or "very available"? As in, we have "secured" a constant municipal water supply?
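
    Whatever "secure" means here, the eight-hour battery claim from TFA is at least easy to sanity-check. A rough sizing sketch (the per-machine load, bus voltage, and usable depth of discharge are my assumptions; only the eight hours is from TFA):

    ```python
    # Rough sizing for TFA's "run off-grid for up to eight hours" claim.
    # Only BACKUP_HOURS comes from TFA; everything else is an assumption.

    N_MACHINES = 50
    LOAD_W = 60.0          # assumed average DC draw per machine, watts
    BACKUP_HOURS = 8.0     # from TFA
    BUS_VOLTAGE = 48.0     # assumed telecom-style DC bus voltage
    USABLE_FRACTION = 0.8  # assumed usable depth of discharge

    energy_wh = N_MACHINES * LOAD_W * BACKUP_HOURS             # 24,000 Wh
    capacity_ah = energy_wh / (BUS_VOLTAGE * USABLE_FRACTION)  # ~625 Ah

    print(f"Energy needed: {energy_wh / 1000:.0f} kWh")
    print(f"Battery bank:  {capacity_ah:.0f} Ah at {BUS_VOLTAGE:.0f} V")
    ```

    That's a substantial battery bank, but an entirely ordinary one by telecom standards.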

  • Re:So... what? (Score:5, Interesting)

    by ShakaUVM ( 157947 ) on Tuesday March 22, 2011 @12:49PM (#35574922) Homepage Journal

    Edison: "Genius is one percent inspiration and ninety-nine percent perspiration."

    Tesla: "If Edison had a needle to find in a haystack, he would proceed at once with the diligence of the bee to examine straw after straw until he found the object of his search. I was a sorry witness of such doings, knowing that a little theory and calculation would have saved him ninety percent of his labor."

  • by KenSeymour ( 81018 ) on Tuesday March 22, 2011 @02:52PM (#35577092)

    When I first had to deal with telephone equipment, I came across the -48 VDC power standard for things like SONET nodes, digital cross connects, channel banks, and telephone switches. I believe this is due to cathodic protection [wikimedia.org] of buried copper cables.

    You can find -48 VDC rectifiers, A/B fuse panels (think redundant DC feeds), and lots of rack-mounted telecom gear powered from -48 VDC.

  • Re:So... what? (Score:4, Interesting)

    by Grishnakh ( 216268 ) on Tuesday March 22, 2011 @05:47PM (#35579616)

    AC is still the prime motive force in electrical generation and always will be.

    No, not necessarily.

    We've already started moving away from AC for long-distance power transmission, using "HVDC" instead for links running at hundreds of kilovolts.

    The main advantage of AC is that, with no semiconductor technology available, you can easily step it up and down between different voltages using an iron-core transformer, nothing more than a bunch of iron and some copper wire wrapped around it. High voltage is absolutely necessary for power transmission, because I^2*R losses are too high at lower voltages, but high voltage isn't usable by end-users because of safety and other concerns.
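
    To put numbers on that I^2*R point: for a fixed delivered power, doubling the voltage halves the current and quarters the resistive loss. A quick sketch (the power level and line resistance are illustrative assumptions):

    ```python
    # For a fixed delivered power P, line current is I = P / V,
    # so resistive loss I^2 * R falls with the square of the voltage.
    # P_W and R_OHM are illustrative assumptions.

    P_W = 100e6    # 100 MW delivered (assumed)
    R_OHM = 10.0   # assumed total line resistance, ohms

    def line_loss_w(power_w, voltage_v, r_ohm):
        current_a = power_w / voltage_v
        return current_a ** 2 * r_ohm

    for kv in (110, 400, 800):
        loss = line_loss_w(P_W, kv * 1e3, R_OHM)
        print(f"{kv:4d} kV: {loss / 1e6:5.2f} MW lost ({loss / P_W:.1%})")
    ```

    That run prints roughly 8.3% lost at 110 kV versus 0.2% at 800 kV, which is the whole case for high-voltage transmission in three lines of arithmetic.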

    Nowadays, with power electronics (giant power transistors capable of handling thousands of volts and amps) and high-frequency switch-mode power conversion, that advantage is mostly obsolete, so it's entirely possible to eliminate AC for power transmission and even get better conversion efficiency than transformers. The only reasons it's still used are 1) our infrastructure already runs on AC, so you can only replace it in places where doing so won't be too disruptive (like long-distance links), and 2) iron-core transformers are still much cheaper than electronic alternatives, so switching to DC is only economically feasible for certain large-scale projects, not for every transformer in a subdivision.

    There's no technical reason that, in the future, DC couldn't become the standard, with electronic "transformers" stepping the voltage up and down as necessary.
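
    As a toy model of such an electronic "transformer": the simplest DC-DC topology, a buck (step-down) converter, sets its output purely with a switching duty cycle. A minimal sketch (the 380 V bus and 48 V output are hypothetical figures, not from the Bath system):

    ```python
    # Ideal buck (step-down) DC-DC converter in continuous conduction:
    # V_out = D * V_in, where D is the PWM duty cycle (0 < D <= 1).
    # The voltages below are hypothetical, not from the Bath deployment.

    def buck_duty_cycle(v_in, v_out):
        """Duty cycle an ideal buck converter needs for a given step-down."""
        if not 0.0 < v_out <= v_in:
            raise ValueError("a buck converter can only step voltage down")
        return v_out / v_in

    # e.g., a hypothetical 380 V DC distribution bus stepped down to 48 V
    print(f"D = {buck_duty_cycle(380.0, 48.0):.3f}")  # ~0.126
    ```

    Real converters add an inductor, switch, diode, and control loop, but in the ideal case the voltage ratio really is just the duty cycle.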
